Redesign of the Survey of Research and Development Expenditures at Universities and Colleges



Summary of Winter 2008–2009 Cognitive Site Visits

Authors:


Mary Hagedorn

David Cantor

Jennifer Crafts

Patricia Dean Brick

Liam Ristow



April 2009


Prepared for:


National Science Foundation

Arlington, Virginia


Prepared by:


WESTAT

Rockville, Maryland



Contents

Section Page


1 INTRODUCTION 1-1


1.1 The Redesign Project 1-1

1.2 Site Visit Approach 1-2


2 FINDINGS 2-1


2.1 Cover Page 2-1

2.2 Survey Definitions and Instructions 2-2


2.2.1 Definition of R&D 2-2

2.2.2 Reference to the OMB Circular A-21 2-2

2.2.3 Separately Budgeted Research 2-3

2.2.4 Distinguishing Between Basic Research, Applied Research, and Development 2-4

2.2.5 Inclusion of Non-Science and Engineering Fields (Box at Bottom of Page) 2-6

2.2.6 Please Include… Please Do Not Include 2-7


2.3 Questions 1 and 2: Source of Funds and Foreign Funds 2-12


2.3.1 General Comments 2-12

2.3.2 Question 1: Three Bullets on Top of Page 2-13

2.3.3 Federal and State Funds 2-13

2.3.4 Industry 2-14

2.3.5 Nonprofit Organizations 2-14

2.3.6 Institutional Funds 2-15

2.3.7 Question 2: Foreign Funds 2-18


2.4 Questions 3 and 4: Medical School R&D and Clinical Trials 2-19


2.4.1 Medical School R&D 2-19

2.4.2 Clinical Trials 2-20


2.5 Question 5: R&D Expenditures by Type of Agreement 2-21


2.5.1 Question Clarity 2-22


2.6 Question 6: Character of Work 2-22


2.6.1 General Comments 2-22

2.6.2 Reaction to the Definitions 2-23

2.6.3 Response Approach and Burden 2-24

2.6.4 Reaction to Publication of Expenditures by Character of Work 2-26



2.7 Questions 7 and 8: Funds Received as a Subrecipient and Pass-Through Funds 2-26

2.8 Question 9: R&D Expenditures From Federal Sources,
by Field 2-27


2.8.1 General Comments 2-28

2.8.2 Question Clarity 2-28

2.8.3 Availability of Non-S&E Data 2-28

2.8.4 Availability of Data for Prorating Expenditures for Projects Spanning Multiple Fields 2-29

2.8.5 Respondent Burden 2-30

2.8.6 Format of Question 2-31

2.8.7 Other Comments 2-32


2.9 Question 10: R&D Expenditures From “Other” Federal Agencies 2-32


2.9.1 Availability of Data and Burden 2-32

2.9.2 Response Approach 2-36

2.9.3 Suggestions and Comments 2-36


2.10 Question 11: Federal Inter/Multidisciplinary R&D 2-36


2.10.1 Question Clarity 2-37

2.10.2 Data Availability and Response Burden 2-38

2.10.3 Reporting Inter/Multidisciplinary at Level of Broad Fields vs. Subfields 2-39

2.10.4 Estimated Percentage of R&D That Involves Multiple Fields 2-39

2.10.5 General Comments and Concerns 2-40


2.11 Question 12: R&D Expenditures From Nonfederal Sources by Field 2-41

2.12 Question 13: Nonfederal Inter/Multidisciplinary R&D 2-43

2.13 Question 14: Cost Elements 2-44


2.13.1 Salaries and Wages 2-45

2.13.2 Fringe Benefits 2-46

2.13.3 Software 2-46

2.13.4 Equipment, Pass-Throughs, and Other Direct Costs 2-46

2.13.5 Indirect Costs 2-47


2.14 Question 15: Capitalization Threshold 2-47

2.15 Question 16: R&D Equipment 2-47



2.16 Questions 17 Through 19: R&D Personnel 2-48


2.16.1 Question Completion and General Comments 2-48

2.16.2 Identifying R&D Personnel 2-48

2.16.3 Providing Data for R&D FTEs 2-49

2.16.4 Defining Faculty and Non-Faculty R&D Personnel 2-50

2.16.5 Question 18: Head Count 2-51

2.16.6 Question 19: Postdoctoral Researchers 2-51

2.16.7 Alternate Questions 17 and 18 2-52


2.17 Questions 20, 21, and 22: Proposals and Awards 2-53

2.18 Intellectual Property and Commercialization 2-54


2.18.1 Question 1: Intellectual Property Commercialization Transactions 2-54

2.18.2 Question 2: Patent Applications and Patents Issued 2-56

2.18.3 Question 3: Patent Filings Resulting from Federally Funded R&D 2-56

2.18.4 Question 4: Start-Up Companies 2-57

2.18.5 Question 5: Detailed Information on Start-Up Companies 2-58

2.18.6 Question 6: Income From Intellectual Property Transactions 2-58

2.18.7 Question 7: U.S. Patents Filed and Issued, by Field of R&D or Department 2-59

2.18.8 Other Comments and Observations About IP 2-61


2.19 Multiple Campus Institutions 2-62

2.20 Preferences for Notification of Survey Changes 2-63



Table


1 Codes from the Catalog of Federal Domestic Assistance (CFDA) 2-34


Appendix


A Questionnaires and Protocols, Version 1 A-1


B Questionnaires and Protocols, Version 2 B-1


C Questionnaires and Protocols, Version 3 C-1



1 Introduction



The Division of Science Resources Statistics (SRS) of the National Science Foundation (NSF) has a legislative mandate to


provide a central clearinghouse for the collection, interpretation, and analysis of data on scientific and engineering resources, and to provide a source of information for policy formulation by other agencies of the Federal Government...


In fulfilling this mandate, SRS administers surveys concerning U.S. science and engineering (S&E) education and human resources and research and development (R&D).



1.1 The Redesign Project

SRS has undertaken an evaluation and redesign of many of the surveys the Division conducts. The Survey of Research and Development Expenditures at Universities and Colleges (Academic R&D Expenditures Survey) addresses a major sector of the national R&D enterprise, collecting information from R&D-performing colleges and universities. The survey includes items concerning R&D expenditures by sources of funding, total and federally funded R&D expenditures by S&E field, character of work (basic research vs. applied research and development), total and federally funded R&D expenditures passed through to subrecipients or received as subrecipients, total and federally funded R&D expenditures in non-S&E fields, and federally funded expenditures by agency and field of S&E.


The goals of the redesign of the Academic R&D Expenditures Survey are to evaluate the survey content in light of changes in the academic R&D enterprise (e.g., interdisciplinary research and collaboration among industry, government, and universities), evaluate and refine the data collection approach, evaluate and revise the statistical methodology, and develop and test the revised survey. The redesign effort began with several initial information-gathering and assessment components: a data users’ workshop, a review of relevant literature, a synthesis of cognitive research conducted during the prior five years, consultation with a panel of experts, and an evaluation of the survey methodology. Following those initial activities, a series of site visits was conducted in January through March 2008 for the purpose of exploring redesign issues that emerged in the early information-gathering efforts and associated survey response issues. The survey questionnaire was revised following those site visits.


From December 2008 through February 2009, site visits were made to 17 institutions to conduct cognitive interviews with survey respondents, evaluating the changes to the questionnaire and a possible new module on intellectual property and commercialization. Institutions were selected to provide variation on a number of characteristics: level of R&D expenditures, public/private control, presence of a medical school, single versus multiple campuses, and geographic location. In addition, efforts were made to include at least two historically black colleges or universities (HBCUs) among the site visit institutions. Geographic clustering was used to maximize the efficiency of the site visit trips and to limit costs; four clusters of institutions were visited, as well as one institution that could be reached in a one-day trip from the Washington, D.C., area.


The selected institutions were contacted and asked to participate in the site visits. About two to three weeks prior to the visit, each institution was sent three documents: the current survey instrument, the revised survey instrument, and a list of changes and additions. Westat staff worked with the survey respondent at each institution to establish an agenda for the visit based on the availability of appropriate university personnel.



1.2 Site Visit Approach

Two site visit protocols were developed by staff of Westat and NSF: a questionnaire protocol and a protocol exploring draft questions about intellectual property and commercialization.


During each visit, the protocols were used to ascertain reactions to the instrument changes and additions.


  • The questionnaire protocol “walked through” the questionnaire and included questions about respondent reactions to the items and instructions, the availability of data, and the process involved in responding to the items. In addition, questions were asked about multi-campus reporting and respondent preferences for how they are notified about questionnaire changes.

  • The intellectual property (IP) protocol contained similar types of questions about definitions, instructions, data availability, and response process for draft items about intellectual property and commercialization.

The protocols were administered according to the availability of university staff. As a result, the order of questions addressed in the site visits varied by institution. Because limited time was available at some institutions, it was necessary in some cases to focus only on significant changes or additions rather than reviewing each survey question. At four institutions, persons responsible for IP could not be interviewed because of time constraints for the site visit or their unavailability on the day of the visit.


Some changes were made between groups of visits as issues with the survey instructions and items were identified. Appendix A contains the questionnaires and protocols used in the first group of visits, Appendix B contains the materials from the second group of visits, and Appendix C contains the materials for the remaining visits.1


This report summarizes the information obtained in 17 site visits with research-performing higher education institutions.



2 Findings



This chapter reviews the findings for each section of the questionnaire and for the draft questions concerning intellectual property and commercialization, as well as some additional issues addressed in the site visits. We begin with the Higher Education Research and Development Survey (HERD) questionnaire, including the instructions and each survey item, followed by a discussion of intellectual property and commercialization. The chapter closes with the other topics discussed in the site visits: multi-campus reporting and preferences for being notified about survey changes.


2.1 Cover Page

One version of the cover page was used for all 17 site visits. Of the 17 sites, only 7 provided comments on the cover page. When asked about the cover page, many participants spontaneously commented on other issues and did not return their attention to the cover page. In general, the response to the redesigned cover page was positive, but participants did not express strong feelings about it.


Of the seven sites that provided comments on the cover page, two said that the cover had a “less cluttered” look and that the new design was an improvement. Another respondent said that it was “straightforward and easy to understand.” Others had generic comments such as “it’s fine,” or “it’s okay.” One commented that the redesigned cover page was “similar” to the previous version.


One participant asked to be shown the language that stated the survey was voluntary. This information is on the cover page, and the respondent said simply that he “had missed it.”


Another respondent said that it was good to have the contact person included, and a different respondent said that they “liked having the acronym.”


2.2 Survey Definitions and Instructions

Page 2 of the questionnaire included the survey definitions and instructions. One part of the definitions and instructions on page 2 changed slightly after the first round of cognitive interviews. Throughout this discussion of the survey definitions and instructions, the version that respondents saw is explicitly stated; in most cases only one version was used, and this is also noted.



2.2.1 Definition of R&D

One version of the definition of R&D was shown to all 17 sites. This version is displayed below:


Research and development (R&D)

includes “organized research” as defined by 2 CFR 220 (OMB Circular A-21). Please include all R&D activities of an institution that are separately budgeted and accounted for (see definition below). R&D includes both “sponsored research” activities (sponsored by Federal and non-Federal agencies and organizations) and “university research” (separately budgeted under an internal application of institutional funds).

Research

is the systematic study directed toward fuller knowledge or understanding of the subject studied. Research is classified as either basic or applied, according to the objectives of the investigator.

Basic research – is research directed toward an increase of knowledge; it is research where the primary aim of the investigator is a fuller knowledge or understanding of the subject under study rather than a specific application thereof.

Applied research – is research conducted to gain the knowledge or understanding to meet a specific, recognized need.

Development

is the systematic use of the knowledge or understanding gained from research, directed toward the production of useful materials, devices, systems, or methods, including design and development of prototypes and processes.


Of the 17 sites that saw the above definition of R&D, 3 did not comment on any part of it. The reactions of the 14 that did comment on some part of the definition are described below.



2.2.2 Reference to the OMB Circular A-21

Three of the 14 institutions commented on the reference to OMB Circular A-21. One participant questioned whether the definitions of basic research, applied research, and development came from OMB. NSF staff answered this query.

One participant said that the reference to OMB Circular A-21 made sense for them. They apply the definition in OMB A-21 to their accounting practices. On the other hand, they said that “gift funds” did not make sense with reference to OMB A-21. Under A-21, gift funds would not go under organized research.


A respondent for a university system commented that “organized research” as defined by OMB A-21 versus departmental research was something that has always given them trouble. They said that A-21 definitions do not apply well to everything, and the A-21 definitions are vague in a number of respects. (OMB A-21 specifically classifies departmental research as instruction.)




2.2.3 Separately Budgeted Research

Thirteen of the 17 sites commented on “separately budgeted research.” Nine of the institutions expressed an accurate understanding of what separately budgeted research includes. Two of the nine said that they can report for any research that is separately budgeted (sponsored or university research). The other seven did not comment on their reporting capability but simply indicated their accurate understanding of the term.


Four of the 13 institutions expressed some problems with their ability to report using the survey definition of separately budgeted research.


  • One institution said that traditionally, they included both sponsored research and “independent research” (i.e., departmental research) in what they reported to NSF. (They understood from the discussion that this was probably misreporting.)

  • Another institution said that their departmental research is substantial but not separately budgeted. Even though it is not separately budgeted, they can nevertheless identify it; excluding it would result in about a 20 percent underestimate of their research expenditures.

  • Another participant said that they have “research slush funds” that are not huge, but are significant. At present, they are including these as research expenditures, even though they are not “separately budgeted.”

  • Another participant said that they have departmental funds that are used to support research functions. They called these funds “ICR funds.” The ICR funds can go to support a wide range of research activities: a new initiative, general support, etc. This site maintained that the definition suggested that funds not separately budgeted be included—and they requested clarification.

2.2.4 Distinguishing Between Basic Research, Applied Research, and Development

Eight of the 17 sites did not mention the definition of basic, applied, and development when they were asked to look at page 2. A few of these eight discussed the definitions when they reached question 6, which asks for the breakdown on basic, applied, and development. Nine of the 17 commented on the definition of basic, applied, and development when they looked at page 2.


One of the nine said that they collect information on basic and applied research at the award stage. The principal investigator (PI) completes an awards form and checks whether the award is for basic or applied research. They cautioned that the PIs complete the form with varying levels of precision and some simply leave this field blank. They also said that “development” was not on the form and they would have to modify the form to include development.


The other eight of the nine who commented expressed issues with reporting basic research, applied research, or development.


  • One institution said that their systems do not distinguish between applied and basic research and when asked, they simply report 100 percent basic because of their philosophical position that universities do basic research.

  • Another participant said that the distinction between basic and applied is inherently fuzzy and not particularly useful. “Applied research is appropriate for anything that is pure engineering, but fuzzy for anything else.” “The distinctions are not consistent even within one campus office. What someone will call basic another will call applied. People have a good sense at the extreme ends of the spectrum, but lines between the two are very fuzzy.”

  • Another participant took a similar stance and said that the distinction between basic and applied is highly arbitrary and quoted a professor who said that at his university, the distinction between basic and applied depends on what building you are in. This site said that they were not able to provide this information.

  • Another institution said that they cannot distinguish between basic and applied research. This is not part of their record-keeping and any reporting would be entirely based on surmise. Their experience with having PIs classify their research has not been good, as PIs are not concerned with the accuracy of their classifications. This respondent suggested that all research funded by government be considered basic and everything funded by private organizations be applied. This would make it “simple.”

  • Another participant said that the distinction between basic and applied research is not something the accounting system provides and an accountant is not qualified to make this determination. They also felt that there was a good bit of subjectivity in this determination.

  • Another institution said that they do not have a way to track basic and applied research. Their pre-award system makes no distinction at present. Furthermore, “development” is a term that they do not associate with research; development means to “go raise money.”

  • Another participant said that they guess at whether something is basic, applied, or development. They assume that contracts are more likely to be applied and grants more likely to be basic. This site said that they would need the PI to classify the research and at present there is no attribute in their system for this classification.

  • Another participant expressed increased confusion about how to classify their research after reading the definitions. Traditionally, the institution has always classified their research as basic, but after reading the definitions, they think their research is applied. They further explained that their research always has a goal of effecting some sort of change, which would align more closely with applied research.


Two different wordings of the “separately budgeted” definitions and instructions were used for the site visits. We showed participants at the first four sites the initial wording:


Separately budgeted R&D

includes all funds expended for activities specifically organized to produce research outcomes and commissioned by an agency either external to the institution or separately budgeted by an organizational unit within the institution.

Current fund expenditures

are expenditures of funds available for current operations. Such expenditures include, among others, all those funded from unrestricted gifts and restricted current funds to the extent that such funds were expended for current operating purposes.


Of the four sites that saw the original version, two said that “current fund expenditures” was a term that they did not use and had trouble interpreting.


One participant said that “current fund” has no “direct meaning” and that they assumed it meant “operating accounts or operating dollars.” Another suggested using the phrase “exclude capital projects” as the best way to express “current funds.” They expanded on this by adding that no one used the term “current funds” anymore, and that the term in use is “operating funds.”


To accommodate these comments and suggestions from the first round of interviews, the definition was revised and the remaining 13 sites responded to the version below:


Separately budgeted R&D

includes all funds expended for activities specifically organized to produce research outcomes and commissioned by an agency either external to the institution or separately budgeted by an organizational unit within the institution. Such expenditures include, among others, all those funded from unrestricted gifts and restricted current funds to the extent that such funds were expended for current operating purposes. Exclude capital projects.


In the revision, the separate definition of “current funds” was omitted and “exclude capital projects” was added.


Of the 13 sites that were shown the revised version, 4 made no comment on this part of page 2 at all. Of the 9 that did comment, one asked why capital projects were excluded. One said that they “never” have any capital projects because they rent all their facilities. Another said that they had no capital projects “now.” Two commented that capital projects were “not research,” and the other four said that they understood that capital projects were not to be included or that they were not currently including capital projects.



2.2.5 Inclusion of Non-Science and Engineering Fields (Box at Bottom of Page)

The lower box on page 2 stating the inclusion of all R&D fields was presented to all sites in one form. All 17 sites saw the instructions below:


Change in reporting: All Fields of R&D now included in all survey items

Please note that this revised questionnaire includes all fields of R&D in all survey items. Responses to all survey items should include R&D within science and engineering fields as in the past, and also should include R&D within all other fields such as humanities, education, law, and the arts. See question 9 for a complete listing of all fields of R&D.


Of the 17 sites, 2 did not comment on the box at all. Of the 15 sites that did comment on the box, 12 said that they could provide the information requested, that is, R&D expenditures for non-science and engineering fields. Three participants simply said that they understood the box to be telling them to include non-science and engineering fields, but they did not say whether they could do this or whether it would be easy or difficult.


The 12 sites that said they could provide data on non-science and engineering fields thought it would be relatively simple to do so. One site commented that it was more work to pull out the non-science and engineering R&D expenditures than to include them. Another site said that they could easily produce the requested data, but it would entail some reprogramming.


Of the 12 that said they could provide the data on non-science and engineering fields, 8 said that this could be “very easily done” or that there was no additional burden involved. No institutions stated that they could not provide data on non-S&E fields.



2.2.6 Please Include … Please Do Not Include …

The instructions on page 3 guide the respondent on the R&D expenditures, types of grants and projects, and institutional components to include or exclude in their survey reporting. Two versions of these instructions were shown to respondents at the site visits. The instructions were revised after the first set of site visits, so the first set of universities commented on the first version of the instructions and the remaining 13 universities commented on the revised version. The particular instruction that was changed is discussed under Clinical Trials below.



Clinical Trials


The wording of the first version of the questionnaire instructed respondents to exclude:

  • Phase IV clinical trial expenditures


The wording of the revised version instructed respondents to exclude:


  • Phase IV clinical trial expenditures

(studies done after the drug or treatment has been marketed; see Question 4 for a full definition)


This change was made because three of the four universities in the first set said that they did not know what a Phase IV clinical trial was. Two of these four universities did not conduct clinical trials; at Question 4, one of them said they did not know what a Phase IV clinical trial was. The other two universities in the first set conducted clinical trials but were unable to identify the phase of a trial from their databases; they too said that they were unsure what a Phase IV clinical trial was. These two universities said that the phase of a clinical trial was probably in the IRB database and that, to report on the phase, they would need to query the IRB. One university said that it would be very burdensome to distinguish clinical trials by phase since it would require a completely manual examination of IRB or departmental records.


Most of the discussion about clinical trials occurred at Question 4. The discussion that emerged at Question 4 was sufficient to motivate the revision of the instructions on page 3.


Of the 13 sites that saw the revised instruction, 7 said that they did not conduct clinical trials, so this instruction did not apply to them. Of the 6 sites that did conduct clinical trials, 3 did not comment on the revised instructions on page 3.


The three sites that did conduct clinical trials and did comment on the revised instructions said that the phase of the clinical trial is not identified in their accounting databases; the phase is recorded at the department level or in the IRB database. Only one of the three further commented that they cannot identify clinical trials at all in their accounting system because they have no attribute that identifies them.



Demonstration Projects


Of the 17 sites, 12 commented on the instruction concerning demonstration projects when reviewing the instructions on page 3. Of these 12, 7 said that they did not know what a demonstration project was. One wondered whether a demonstration project was applicable to a university.


Of the 12 sites that mentioned demonstration projects, 5 expressed some concern about their ability to report demonstration projects correctly. Three of these five sites said that they code demonstration projects as research; one said that they code them as public service; and one simply confirmed their understanding of what a demonstration project is. Examples of the comments made by these five sites appear below:


  • One commented that they did not have a specific classification for demonstration projects and that demonstration projects were not readily identifiable in their accounts. This site commented further that a demonstration project was “applied research.”

  • Another site commented that their agricultural school has demonstration projects, but they consider these to be “public service” since the public is the target of the demonstration project.

  • Another site confirmed that their understanding of demonstration project was consistent with survey intent, i.e., that a demonstration project shows how something theoretical works in practice and gave the example of the odorless swine farm.

  • Another site said that they conduct demonstration projects, but they code them as research.

  • The fifth site said that they cannot distinguish demonstration projects. If a demonstration project has been coded as research, then they cannot break it out [to identify it as a demonstration project]. Their demonstration projects are reported as research.



Notes to the Financial Statement


Of the 17 sites, 10 did not mention notes to their financial statements when reviewing the instructions on page 3. The 7 sites that mentioned the notes when reviewing the instructions on page 3 confirmed an understanding of “notes to the financial statement” that was consistent with survey intent. There was no misunderstanding or confusion about what the notes were or what was contained in them. The instruction about notes to the financial statement was interpreted in a straightforward and consistent manner.



K Awards and T32 Grants


Of the 17 sites, 7 did not mention the K Awards or the T32 Grants. Of the 10 sites that did mention the K Awards and the T32 Grants, 6 said that they coded the K Awards and the T32 Grants as research. On the other hand, 3 sites said that they did not code the K Awards or the T32 Grants as research, but as instruction or training. One site simply made the comment that the research training grants (top box, left column) should appear across from the non-research training grants (top box, right column).


The three sites that did not code the K Awards or the T32 Grants as research gave the following comments:


  • One site commented that they would be “more careful” in the future with their coding of research training grants. At present, they code K Awards and T32 Grants as instruction or academic development for the students; they will code these as research in the future.

  • Another site disagreed with the instruction, saying that they consider T32 grants to be instruction/training and that these grants should not be included in their report of research expenditures.

  • Another site said that there was “a potential problem” with the research training grants as opposed to other training grants. In their system, once a grant has been classified as a training grant, it is no longer clear or obvious whether it has a research component. This site can report such grants only as training grants or as research grants. This university’s system for coding grants did not align well with the requested survey categories.



Institutional Components


Five of the 17 sites reported foundations. None of the five foundations conducted research. Two sites reported that their foundations were for fund-raising purposes only. One site reported a foundation whose only function is to hold IP generated by the university. One site reported a foundation that was an agricultural center but did not conduct research.


One site reported a variety of institutional components: an aquaculture station whose only function is to keep out non-indigenous fish, a venture capital organization, and a power company. All of these components were separated from the university by many organizational layers, so none of these activities were eligible for reporting in the HERD survey.



Federally Funded R&D Centers (FFRDCs)


Five of the 17 sites commented on the instruction concerning FFRDCs when examining page 3 of the questionnaire. Two institutions actually had FFRDCs and they understood that these were separately budgeted and that the FFRDCs were not to be reported along with the university in their survey responses.


These sites commented:


  • “FFRDCs have a separate funding – there are 38 of these separately funded research labs.”

  • “[We] currently exclude FFRDCs.” This site further explained that any expenditure that is not a part of their financial system is excluded. An FFRDC would not be a part of the university’s system.


Three sites expressed some confusion about what an FFRDC actually is. One site commented that an FFRDC is not a commonly understood concept and that an FFRDC is not just any laboratory with Federal money. This respondent went on to say that there are only 37 FFRDCs and “if you had one, you would know it.” Another site that reported an agricultural center was confused about what an FFRDC was. They thought it might be the same as their state-funded “centers of excellence for R&D.” A third site wondered whether to report a research center on their campus sponsored by NSF as an FFRDC. They further commented that a list of the FFRDCs would be helpful to them in making this determination.



Research Equipment Purchased from Current Fund Accounts


Four of the sites mentioned research equipment purchased from current fund accounts when reviewing page 3.


  • One site asked what “research equipment purchased from ‘current fund’ accounts” meant. They had no idea what this phrase referred to. NSF answered this query.

  • One site paraphrased “research equipment purchased from current fund accounts” as including externally funded sponsored projects and equipment that had been purchased with those funds. They further explained that they have an equipment code for equipment that is not capitalized.

  • Another site said that research equipment is already considered part of the project, so the instruction seemed superfluous to this respondent.

  • Another site said that they had just purchased equipment for research and they understood that this was to be included.



Departmental Research


Three of the 17 sites mentioned departmental research when reviewing the instructions on page 3. These universities expressed concern about the instructions not to include departmental research expenditures. Two of the sites said that they considered departmental research a significant component of overall research and development. The third site simply stated that they can track departmental funds and the departmental funds do comprise a portion of their R&D, but did not indicate the size of their departmental research expenditures.


  • One site said that their departmental research expenditures were “substantial.” Departmental research is not separately budgeted for, but can be identified in their accounting system. Departmental research allows the university a high level of flexibility in allocating research funds based on needs and requirements (e.g., a broken piece of equipment or a lost grant). This university surmised that approximately 20 percent of all research and development expenditures were allocated as departmental research. Further, if the expenditures that went to graduate assistants and other laboratory costs were included, departmental research expenditures would comprise an even greater part of all R&D expenditures.

  • Another site said that they have what is similar to a research “slush fund” that is used for discretionary research expenditures. These research funds do not have a separate sponsor and they are not separately budgeted. These funds are not “huge” but they are significant.

  • The other site that discussed departmental research expenditures when reviewing the instructions on page 3 said that they can track departmental funds that are used to support research functions. These funds are used for a great variety of purposes: a new initiative, general support, etc. This university said that they can track these funds, but these funds are not separately budgeted and applied for. These funds are expended based on discretionary needs and requirements. This site requested more clarification about the reporting of departmental research expenditures.



Subawards and Pass-Throughs


Two of the 17 sites mentioned subawards and pass-throughs when reviewing the instructions on page 3. One site simply requested additional clarification regarding the reporting of subawards and pass-throughs and made no further comment.


The other site, a multi-university system, expressed considerable concern about their ability to report subawards and pass-throughs accurately. They said that their system does not support tracking funds passed through to subrecipient organizations. They know where the funds go, but they have no way of extracting data on whether the funds go to an educational institution or some other type of institution or organization. In the past, this university system has left this question blank, and will continue to leave it blank. Nor can they distinguish between a vendor, a subcontractor, and a subrecipient.



Public Service Grants


When reviewing the instructions on page 3, only one site asked for clarification about what a public service grant was. The staff participating in the interview were members of the accounting department and they were unfamiliar with this term.



2.3 Questions 1 and 2: Source of Funds and Foreign Funds

2.3.1 General Comments

Question 1 of the redesigned survey collects information similar to that collected in the current survey. The primary differences are that the redesigned survey (1) asks for both S&E and non-S&E expenditures, (2) moves the survey definitions into the question, and (3) requests data for additional subcategories of expenditures. The general reaction to the question was positive. All 17 sites recognized that the question was asking for essentially the same information that was reported in the current survey.


On the other hand, a number of respondents raised concerns about specific requests, primarily the additional subcategories that have been added. These concerns are illustrated below in the discussion of particular components of the question.



2.3.2 Question 1: Three Bullets on Top of Page

There are three instructional bullets at the top of the page:


  • Include both direct and recovered indirect costs in rows a, b, c, d, and f.

  • Report the original source of funds, when possible. For example, if you received federal funds from another university, report that amount under “U.S. federal government.”

  • Include all fields of R&D: sciences, engineering, humanities, education, law, arts, etc. See full listing in Question 9.


All 16 respondents who commented on these instructions understood them. The primary comments concerned the second bullet on source of funds. Of the 12 who said something about this bullet, one said they could provide only the proximate source; five said they could provide the original source only when the funds came from the federal government; and the remaining six said they could provide the original source for both federal and nonfederal organizations.
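
To illustrate how the “original source” instruction might be operationalized, the sketch below (in Python, with invented award records and field names, not drawn from any site’s actual system) attributes each award to its original sponsor when that information is available and otherwise falls back to the proximate source.

# Hypothetical sketch only: award records and field names are invented.
# Funds received as a subaward are attributed to the original sponsor when
# that information is available; otherwise the proximate source is used.

from collections import Counter

awards = [
    {"proximate_source": "University X", "original_source": "U.S. federal government", "amount": 300000},
    {"proximate_source": "NSF", "original_source": "U.S. federal government", "amount": 1200000},
    {"proximate_source": "Nonprofit foundation", "original_source": None, "amount": 50000},
]

totals = Counter()
for award in awards:
    source = award["original_source"] or award["proximate_source"]
    totals[source] += award["amount"]

for source, amount in totals.items():
    print(f"{source}: {amount:,}")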



2.3.3. Federal and State Funds

The two categories requesting expenditures of funds from federal and state sources remain essentially unchanged from the current survey. Respondents generally understood the instructions and expressed no difficulty in providing the information. The category for state and local funds includes a definition that makes explicit the types of funds to be included:


Any state, county, municipality, or other local government entity in the United States, including state health agencies. Include state funds that support R&D at agricultural and other experiment stations.


There was some concern that schools were using this category exclusively for their own state’s funds, rather than including funds from other state or local governments. One of the 17 respondents reported that previously they had been putting funds from states other than their own in the “other” category; in the future, however, they will be able to conform to the above definition. Another school reported that they cannot separate foreign government funds from domestic state and local funds. Otherwise, the remaining 15 schools reported being able to implement this definition without any additional work on their part.



2.3.4 Industry

The description of industry on the questionnaire is:


Domestic or foreign for-profit organizations. (Report funds from a company’s nonprofit foundation in row d.)


This instruction was understood by all respondents. Of the 15 respondents who addressed this question, 3 reported not being able to provide this information within their current system. The issue is that their system has a general code for “private” and does not distinguish between nonprofit and for-profit organizations (see discussion below). Of the remaining 12 respondents, 10 reported no issues with conforming to this definition within their current systems. The remaining two respondents reported difficulties with separating out domestic and foreign work (see discussion of Question 2 for more details).



2.3.5 Nonprofit Organizations

This category is new to the redesigned survey. The wording of this item evolved over the course of the site visits. Initially, the definition of non-profit was:


Nonprofit Organizations and donors

Nonprofit foundations and organizations; gifts from individuals restricted for research purposes.


During the four initial site visits, all of the sites expressed confusion about the reference to “gifts.” According to these sites, gifts cannot qualify as research expenditures because they are unrestricted funds. One respondent said that anything that was a gift, but restricted for research, would probably be labeled a grant.


Based on this feedback, the questionnaire was changed by eliminating the reference to donors in the title and to gifts in the definition. For the remaining 13 site visits, the definition was:


Nonprofit Organizations

Nonprofit foundations and organizations.


Of the remaining 13 institutions, 5 said they could provide this information from their current information systems. The other 8 institutions said they would have to develop a new system, either through reprogramming or by manually reviewing individual grants. Three of these institutions expressed significant concern that this would involve checking the 501(c)(3) status of each organization through an official source, such as the IRS. One of the three expressed deep concern about the amount of burden this would impose, saying that it would take the “…rest of the year to code.” The other institutions that do not currently code this information were not as outwardly concerned with the task, but did say that they would need to develop a program or a manual methodology to identify these items.
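
As a rough illustration of the reporting task these sites described, the following sketch (hypothetical award records and sponsor-type codes, not any institution’s actual system) tallies expenditures for the proposed nonprofit category and flags awards carried under a generic “private” code for manual nonprofit/for-profit review.

# Hypothetical sketch: sponsor-type codes and dollar amounts are invented.
# Sites with a single legacy "private" code cannot separate nonprofit from
# for-profit sponsors, so those awards are flagged for manual review
# (e.g., checking an organization's 501(c)(3) status).

from collections import defaultdict

awards = [
    {"sponsor": "Heart Health Foundation", "sponsor_type": "nonprofit", "amount": 120000},
    {"sponsor": "Acme Pharma Inc.", "sponsor_type": "industry", "amount": 250000},
    {"sponsor": "Unclassified private sponsor", "sponsor_type": "private", "amount": 60000},
]

totals = defaultdict(int)
needs_review = []
for award in awards:
    if award["sponsor_type"] == "private":
        needs_review.append(award["sponsor"])
    else:
        totals[award["sponsor_type"]] += award["amount"]

print(dict(totals))
print("Needs manual nonprofit/for-profit review:", needs_review)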



2.3.6 Institutional Funds

Institutional funds are already collected on the current survey. The redesigned survey proposes to collect more detailed information within this general category. In addition, the definitions of these categories have been placed adjacent to each category. The effect, at least during these interviews, seems to have been to draw increased attention to what is included in these categories.



Institutionally Financed Organized Research

The wording of this item evolved over the site visits. In the first round of four sites, this category was defined as:


Include direct expenditures allocated for separately-budgeted organized research. Include funds from unrestricted sources such as:

  • General-purpose state and local government appropriations

  • General-purpose awards from industry, foundations, etc.

  • Tuition and fees

  • Endowment income and unrestricted gifts

  • Other institutional funds, such as recovered indirect costs


This definition was taken, almost verbatim, from the instructions in the current survey. However, it clearly drew more attention during the site visits. In three of the four visits, respondents were confused by the term “unrestricted sources,” as well as the reference to the funds in the bulleted list. In most cases, respondents thought the question was asking for research that was funded directly from the referenced sources. Confusion occurred because under this interpretation, the research did not meet the A-21 criteria. For example, one site pointed to the reference to “tuition and fees” and expressed concern that they would even consider using this money as a direct source of research funds as defined by A-21. Similar confusion was expressed at the reference to “unrestricted gifts,” which again, by definition, is not allowable as research under A-21. Both of these seemingly conflicted with the idea that this was “separately budgeted organized research” as referenced in the definition above. A similar confusion was expressed by one respondent when pointing to the phrase “recovered indirect costs.” This particular respondent thought it conflicted with one of the bullets at the top of this page that asked for recovered indirect costs to be included in all of the other lines.


After the first round of visits, the wording of this item was modified to emphasize that this category of funds was meant to provide sources of institutional funds, not to actually link these funds to particular research projects.


Include direct expenditures separately-budgeted for organized research. Include expenditures funded from unrestricted sources such as:

  • General-purpose state and local government appropriations

  • General-purpose awards from industry, foundations, etc.

  • Tuition and fees

  • Endowment income and gifts

  • Other institutional funds, such as recovered indirect costs


The first and second sentences were reworded to tie the monies more directly to organized research. In addition, the second sentence now refers to expenditures funded from the listed sources rather than to the funds themselves.


This rewording did not seem to eliminate confusion about the purpose of the bulleted list. Respondents were still confused about how to define institutional research. Of the eight institutions that provided feedback on this item, five expressed some confusion about how institutional research is defined. One site thought the bulleted list blurred the distinction between departmental research and organized research as defined by A-21. Another respondent was confused about why “industry” was included in the bulleted list given that it was already called out as a separate data item. Still another site was confused about how the funds on the list could be considered research expenditures.



Cost Sharing

Nine of the 17 institutions reported that they could provide cost-sharing data in the form requested. Three of the 17 said they could not report the cost-sharing information because cost sharing is not tracked in their systems. The remaining five sites did not say whether or not they could report cost sharing.


Much of the discussion around this item focused on the definition included in the survey. There was some confusion about the meaning of cost sharing: of the 17 institutions, 9 expressed some confusion about exactly what qualifies under the survey definition. The most common point of confusion was the distinction between mandatory and voluntary cost sharing, which is called out in the definitions for this item. Respondents were more familiar with the three-way distinction between mandatory, voluntary committed, and voluntary uncommitted cost sharing. The first two are of interest to the survey; they are distinguished by whether the institution budgeted the cost sharing because the award required it (mandatory) or volunteered the funds in the budget (voluntary committed).


The discussion at several of these institutions suggested that information on cost sharing had not previously been included in the institutional funds category. For example, respondents at three of the smaller institutions reported that providing cost-sharing information would require significantly more effort. To illustrate, one of these three said that reporting it would require an additional step: they would have to pull four or five queries out of their data system and then massage the data from those queries. They reiterated that their effort to complete the survey is already significantly higher than the estimated 22 hours, and that this step would add significantly to their burden.
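
As an illustration of the classification step described in this subsection, the sketch below (with invented cost-sharing records) counts only mandatory and voluntary committed cost sharing, the two categories of interest to the survey, and excludes voluntary uncommitted effort.

# Hypothetical sketch: cost-sharing records are invented. Only mandatory and
# voluntary committed cost sharing are summed; voluntary uncommitted effort
# is excluded, matching the distinction discussed above.

cost_share_records = [
    {"project": "P-1001", "type": "mandatory", "amount": 15000},
    {"project": "P-1002", "type": "voluntary_committed", "amount": 8000},
    {"project": "P-1003", "type": "voluntary_uncommitted", "amount": 3000},
]

REPORTABLE_TYPES = {"mandatory", "voluntary_committed"}

reportable = sum(r["amount"] for r in cost_share_records if r["type"] in REPORTABLE_TYPES)
print(f"Cost sharing reportable as institutional funds: {reportable:,}")  # 23,000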



Unrecovered Indirect Costs

There was less ambiguity on the wording and definitions associated with unrecovered indirect costs. Eight of the institutions said that they currently calculate the costs as described in the item and that it would not be a problem to continue this process in the future. Three institutions said they could do the calculation, but it would require a significant amount of effort, and none of them had calculated unrecovered indirect costs in the past. The remaining six institutions did not specifically comment on this item.



Confidentiality

Respondents were asked how they felt about NSF publishing institutional cost data at the institution level. The respondents were almost evenly split. Of the 12 that addressed this issue, 5 reported they would be in favor of it, 4 reported being against it, and 3 did not feel they had the authority to speak for the university on the matter. Those in favor of releasing the information were of the opinion that most of the information on indirect costs is public and there was little to hide. There was also some curiosity about what others would report. The arguments against releasing the information had to do with appearances and data quality. With respect to appearances, respondents were fearful that the data on institutional costs could be misinterpreted by funders as showing an ability to reduce indirect rates. From a data quality perspective, some respondents thought that the methods of calculating these numbers would not be consistent across organizations, making them difficult to compare.



2.3.7 Question 2: Foreign Funds

Question 2 is a new topic that is proposed for the survey. Five of the 17 institutions said that they currently code a foreign attribute in their system. Ten of the 17 reported that they do not currently do so, but would be able to code awards using the definition provided in the survey. Two of the 17 respondents did not directly answer how they would provide the data.


Of the five institutions that currently code this information, three use a definition that is not consistent with that specified on the survey. In two cases, the coding is done by the nationality of the parent company or the owner of the company. The third case was an institution with a code in their system that uses the location where the company is incorporated. It should be noted, however, that this institution does not currently have any funding from foreign sources. Consequently, their approach was more hypothetical than an application to actual funding sources.


The definition was a point of discussion for a number of the respondents. For several, the discussion related to the amount of burden this item might place on them when providing a response. One institution felt that it was not appropriate to code according to location, since the parent company could be a better indicator of whether the funds were foreign; to investigate the lineage of a company, each grant folder would have to be examined and the ownership investigated. Another institution said they have been trying to develop a code for some time, but they are having a hard time operationalizing a definition. As an example, they described a research agreement with a business in which they deal with a group located in their state; however, they cannot determine whether they are dealing with a subsidiary of the company or with the parent company located in a foreign country. The company has researchers in both locations. With respect to the HERD definition, they do not actually capture the location of the award.


Two institutions expressed significant concerns about the effort required to implement the definition. In both cases, these institutions implied that the effort might lead to fudging the information or simply leaving the field blank. One relatively small institution said they would “pull the information down and drill into it.” Although they do not have many awards from foreign sources, it would take a significant amount of time to track this information. A much larger research institution expressed concern that they would either have to change their accounting system or manually examine approximately 1,500 accounts. In the end, this individual said, “people would just do it in their heads.”
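
The sketch below illustrates why the definition matters for burden: the same invented award records yield different foreign-funds totals depending on whether the foreign attribute is derived from the funding entity’s location or from its parent company, the two interpretations respondents described.

# Hypothetical sketch: award records are invented. The same awards produce
# different foreign-funds totals under the two interpretations respondents
# described (location of the funding entity vs. nationality of its parent).

awards = [
    {"sponsor": "Globex Research GmbH", "sponsor_country": "DE", "parent_country": "DE", "amount": 90000},
    {"sponsor": "Initech Labs (U.S. subsidiary)", "sponsor_country": "US", "parent_country": "JP", "amount": 140000},
]

def foreign_by_location(award):
    return award["sponsor_country"] != "US"

def foreign_by_parent(award):
    return award["parent_country"] != "US"

for rule_name, rule in (("location", foreign_by_location), ("parent company", foreign_by_parent)):
    total = sum(a["amount"] for a in awards if rule(a))
    print(f"Foreign funds under the {rule_name} rule: {total:,}")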



2.4 Questions 3 and 4: Medical School R&D and Clinical Trials

2.4.1 Medical School R&D

Ten of the 17 institutions had a medical school for which they would report expenditures on Question 3. Five of the institutions with a medical school indicated that they could respond to the question with little or no difficulty. One state university system with a dedicated medical campus said it would be “straightforward” to report R&D expenditures that took place on the medical campus, which awards degrees in medicine, dentistry, nursing, and pharmacy. However, respondents from this site noted it would be more difficult to separate out expenditures from specific components of the medical campus, such as those that award the MD or DO degrees.


Another institution, also part of a state university system, said that reporting expenditures from their medical school would be possible. This institution noted that their medical school is considered a “site” of the university system’s main medical school, which is located on another campus. Although the institution’s accounting system would allow separate reporting of R&D that took place on their campus, the respondents said that funds for research and salary flow from the main medical school. The respondents said that clear guidance would be needed to prevent duplicate reporting of R&D by both their institution and the university system’s main medical school.


When asked if they could separate medical school expenditures when R&D projects crossed into another department or school, respondents gave varying responses. A few institutions set up unique accounts for each PI that can be used to break out institutional units, but this appeared to be uncommon. Indeed, several institutions said that the instruction to report projects where the PI is based in the medical school was problematic. These institutions indicated that they have PIs whose research takes place in the medical school but who do not hold a medical school appointment. Excluding these projects would, in the respondents’ view, result in an undercount of medical school R&D activity. Furthermore, several institutions noted that they do not have the records needed to connect faculty appointments to R&D projects. Two of these institutions suggested removing the instruction to report expenditures for projects with PIs based in the medical school. Four institutions, including a large multi-campus state university system, would prefer to simply report the R&D that took place in the medical school, regardless of what school or department the PI was based in.
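
The following sketch (invented project records) contrasts the two tabulation rules institutions discussed: counting projects whose PI holds a medical school appointment versus counting projects performed in the medical school regardless of the PI’s home unit.

# Hypothetical sketch: project records are invented. The two totals show how
# a PI-based rule and a location-based rule can differ for the same projects.

projects = [
    {"pi_unit": "Medical School", "performing_unit": "Medical School", "amount": 500000},
    {"pi_unit": "College of Engineering", "performing_unit": "Medical School", "amount": 200000},
    {"pi_unit": "Medical School", "performing_unit": "School of Public Health", "amount": 100000},
]

by_pi = sum(p["amount"] for p in projects if p["pi_unit"] == "Medical School")
by_location = sum(p["amount"] for p in projects if p["performing_unit"] == "Medical School")

print(f"Medical school R&D under a PI-based rule:       {by_pi:,}")
print(f"Medical school R&D under a location-based rule: {by_location:,}")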



2.4.2 Clinical Trials

One institution noted that their medical school contains a school of nursing, a hospital, and other components. The respondents suggested adding a checklist of units that are typically part of medical schools or campuses. This would help users of the data make more consistent comparisons across institutions.


Nine of the 10 institutions with medical schools also conduct clinical trials. Six of these institutions track clinical trial expenditures in their record systems and could respond to Question 4 with minimal effort. However, all but one of the six noted that they do not distinguish between clinical trial phases in their record systems. Several of these institutions said that Phase IV clinical trials would represent a very small expenditure amount, if Phase IV trials were conducted at all.


Three institutions have clinical trials but do not have sufficient data to answer Question 4. Respondents at one of these institutions said they planned to include clinical trials in their financial system at some point in the future, but they currently did not have an easy way of reporting them separately. The other two institutions do not have the capability of reporting clinical trials separately. One institution said that a complete “system change” would be required to update the general ledger to include clinical trials. The other institution could report privately financed clinical trials separately, but not those supported by federal funds.



2.5 Question 5: R&D Expenditures by Type of Agreement

Two different wordings of this question were used for the site visits. Participants at the first four sites were presented with the following initial wording:


Question 5. Of the total R&D expenditures that were externally funded (all sources other than Institutional funds), how much was received under each of the following types of agreements?


In the first round of cognitive interviews, where Question 5 was fielded in the form above, two of the sites reported that they could report R&D expenditures by grants and contracts, and two sites reported that they could not.


The two that could not report said that the grant/contract distinction is simply not tracked in their systems and they could not provide the information. The two that could provide the information said that grant vs. contract is an attribute in their systems.


For subsequent rounds of testing, Question 5 was fielded in the following form. Note that the change entails the addition of explanatory parenthetical information.


Question 5. Of the total R&D expenditures that were externally funded (all sources other than the institutional funds reported in Question 1, row e4), how much was received under each of the following types of agreements?


Of the 13 sites that received Question 5 in the revised form, 8 said that they do not currently track whether an award is a grant or a contract and they cannot provide this information at present. Two of the eight sites said that they expected to be able to track whether an award is a grant or a contract in the future. They cautioned that these future systems are not in place, so they cannot comment on how well these systems will function.


  • One institution said that they used to have an attribute in their system for grant or contract, but that they removed it about six years ago because it was considered irrelevant.

  • Another respondent said that they consider grants, contracts, and “cooperative agreements” to be one and the same.

  • Another respondent said that they would report zero for contracts because they have no contracting authority. All research at this university has to be in the form of a grant.

  • A university system said that the distinction between a grant and a contract is inconsistently applied at the campus level. Contract terms are often added to grants, and a grant can develop into a contract. For federal government projects, there is a meaningful distinction to the prime recipient, but for subawards it is not meaningful.


Of the four sites that said they could provide the data on whether an award was a contract or a grant, the following comments emerged:


  • One site said that they can produce the data easily because their databases already include this attribute.

  • Another commented that they can identify contracts vs. grants and that this is easy to do. The indirect cost rates differ, so this is something that is important for them to differentiate.

  • Another site commented that providing the information was “possible” because this is something that they track.

  • The other site simply confirmed that this was an attribute in their system.


One site did not discuss Question 5 because of time constraints.



2.5.1 Question Clarity

There were no comprehension issues associated with Question 5. All respondents understood the difference between a grant and a contract, even when they felt the distinction was meaningless.



2.6 Question 6: Character of Work

2.6.1 General Comments

The character of work question requests R&D expenditures for basic research, applied research, and development, for both federal and total R&D expenditures. Question 6 represents an expansion of a question on the current questionnaire, which asks for the percentage of basic research for both federal and total expenditures.

Participants were first asked for their general reactions to the question. Participants as a whole did not express an interest in collecting data to make the distinction among basic research, applied research, and development work. They do not currently use information about basic vs. applied research internally, nor do they need to track the information to report elsewhere.


Expanding the question to include a third category—development—was met with mixed reactions. Only one institution seemed to have information on development available. The others said that they do not track it and/or questioned whether their institutions have contracts/grants that would qualify as development. At one institution that conducts clinical trials, there was some discussion about whether the participants would consider clinical trials to be development. The participants said the question text should explicitly mention clinical trials within the definition if that is what is meant by development.



2.6.2 Reaction to the Definitions

This section summarizes all participants’ comments about the definitions of basic research, applied research, and development. Note that approximately half of the participants discussed these definitions when they reviewed page 2 (see Section 2.2.4); the other half were asked to comment on the distinctions among the terms when reviewing Question 6.


When participants were asked to comment on the three definitions given within the item and the distinctions among basic research, applied research, and development, they pointed out that it is difficult to apply the definitions because research is often difficult to categorize cleanly.


  • One participant explained that at his institution, the basic vs. applied distinction varies among disciplines. For example, work in the arts and humanities tends toward basic research, whereas contracts and grants in engineering and medical disciplines are typically more applied in nature.

  • Others said that the distinction is somewhat arbitrary or that their institutions are not interested in the distinction among these categories.


Several participants offered specific comments about the definitions.


  • Two participants mentioned terms used within the definitions that they considered to be problematic:

  • One participant wanted clarification of “recognized need” in the definition of applied research.

  • Another participant said that “systematic” within the definition of development is a “buzzword.”

  • Participants from one site said that after reading these definitions, they realized that what they have been reporting as basic research would actually qualify as applied research.


Two participants mentioned that they would need more information than is supplied.


  • One would like more instruction about how to classify contracts and grants into these separate line items.

  • One institution suggested adding concrete examples to go along with the definitions.

Several participants from private schools discussed their reactions to applied research in light of their unique situations.


  • Participants from one institution that conducts clinical trials asked if clinical trials would be moved to applied research.

  • Participants from one private school discussed their need to “stay away from industry research.” They equated industry-supported research with applied research. They said that state schools have a different mission statement that allows them to do applied research. This institution conducts what it considers to be fundamental research, with no restrictions on the ability to publish or on including any qualified staff, including foreign nationals. When asked, this respondent stated that fundamental research and basic research are the same thing. However, this respondent did question whether clinical trials, if included in the survey, would be basic or applied research.


2.6.3 Response Approach and Burden

Participants were also asked to describe how they would go about responding to the question. At most institutions, they were prompted to discuss both how they would identify the expenditures in each of these three categories and the level of effort involved to collect and report this information.


  • Three institutions reported that they currently track basic vs. applied research, so they have the basis for adding a category for development.

  • One institution uses an approach that NSF considers to be a best practice, where PIs provide this judgment on a pre-award form. The institution supplies the definitions of basic and applied research on the page of instructions for the form. This institution would simply add the additional definition to the form instructions and a response option for “development.”

  • The second institution tracks basic, applied, and development work for externally funded research, but not for institutional research. The PIs provide the information when contracts/grants are awarded; the data are coded into their Banner financial system.

  • At the third institution, staff in the finance office review each grant to determine whether it is basic or applied.

  • The rest of the participants said that they currently do not have the ability to track basic vs. applied research or do not have that attribute within their financial systems. These participants would not have the infrastructure in place to support the addition of the third category of development.

  • One participant strongly stated that for the upcoming HERD, it would be impossible to comply. Given the current economic crisis, resources are not available.

  • One institution said that responding to this question would require a lot of tracking; they would have to hire staff to do this work.

  • Participants from one institution said they would not be able to provide the three-part breakout because they did not think they would be able to get PIs to supply data at the project level.


In the course of their discussion, several participants described how they report the percentage of basic research on the current NSF R&D survey.


  • One institution said that they base their response on their awards database, not expenditures data.

  • One institution provides the same estimate each year that was inherited from the previous survey respondent.

  • Several institutions report all or nearly all basic research.

  • Several institutions are reluctant to estimate the percentage, so they leave the field blank so that NSF will impute their value. One participant said that the NSF imputed figure sounded about right, based on knowledge of the types of research conducted at the institution.


2.6.4 Reaction to Publication of Expenditures by Character of Work

Participants were also asked to comment on whether the data about basic research, which are currently only reported at a summary level, should be kept confidential. These data are not published at the level of the individual institution since they are considered to be of a sensitive nature. Responses about continuing with the current approach vs. making the data publicly available were mixed; examples of participants’ reactions are given below.


  • Two mentioned that their concern about publishing the information was based more on a concern about the quality of the data than the sensitivity of the information.

  • A third said that they do not think reporting these data will be meaningful because the definitions are vague.

  • One institution said that since they are part of the federal government, disclosure is not an issue.

  • As expected, private institutions seemed to be more guarded in making this judgment. A participant from one private institution said “this is a senior management question.”

  • A state institution offered the response that publishing the data might lead to them being “pigeon-holed” as doing certain types of work rather than ranging over a broad spectrum.

  • Two mentioned an interest in seeing the data for other institutions, despite not having a strong interest in reporting the data for their own institutions.



2.7 Questions 7 and 8: Funds Received as a Subrecipient and Pass-Through Funds

Due to interview time constraints and the need to discuss new and significantly changed questions, Questions 7 and 8 were covered in only 8 of the 17 interviews. However, these interviews suggested that the questions would pose few problems for most respondents; the exception was one large multi-university system. Six institutions indicated they would be able to report on Question 7, funds received as a subrecipient. Respondents at all of these institutions indicated that they track both the original and proximal source of funds they receive as pass-throughs, making the question easy to complete. Only two institutions said that their record systems would not support reporting pass-through funds received in this way. One of these institutions said that they do not track the original source of funds when they receive a subaward but thought the majority of such funding originated at the federal level. A large multi-university system noted that the sources of pass-through funds often do not tell them that they are a subrecipient, making it impossible to track the original source.


Four institutions indicated they could easily respond to Question 8. Several of these institutions noted that the question is similar to reports they already produce, and aligns with OMB Circular A-133 guidelines. All four institutions said that they separate vendor payments from R&D pass-through funds. On the other hand, three institutions reported that Question 8 was problematic for them. Respondents at one institution said that they do not code for R&D funds they pass through to other institutions, so reporting on Question 8 would be a “manual process” of reviewing their subawards. Another institution codes pass-through funds, but respondents said that vendor payments are not separated out. The third institution (the large multi-university system) said that monitoring of subrecipients is performed outside of the financial system, making it impossible for them to report on Question 8.


When asked about reporting pass-through funds, one additional institution said that they do not have a system in place to make subawards. Instead, this institution will make a new contract in which they act as the prime source to the subrecipient. A respondent at this institution estimated that “a third” of their external R&D funding is passed through in this manner (most of which is federal). It was not clear how or if the institution reports this information on the survey.


One institution suggested providing a link on the web survey to OMB Circular A-133 or including the relevant text from A-133 in the question instructions. A few institutions were asked if they would prefer to have a nonfederal column added to Questions 7 and 8. This did not seem to be a problem, and one institution commented that they would prefer this format. Another respondent expressed this preference at Question 16, discussed in Section 2.15.



2.8 Question 9: R&D Expenditures From Federal Sources, by Field

The current R&D survey contains a question that collects expenditures for federal agencies by S&E fields of R&D. Prior to the site visits, the question was revised in two ways: (1) to include non-S&E fields in addition to S&E fields, and (2) to incorporate the corresponding discipline examples for each field and subfield on the same response page, rather than presenting the examples all together in a separate section of the survey.


During the site visits, interviewers noted whether the site visit participants noticed the multiple-page format. They asked participants to comment on the advantages and disadvantages of the format change and the availability of the data for the additional fields. Interviewers also asked for respondent reactions to the bulleted instructional information provided. Participants were also asked to discuss what they would do to report expenditures that crossed two or more fields and what level of effort would be involved in reporting for this question.



2.8.1 General Comments

One participant stated: “This is a good change. Including the non-S&E fields gives a better picture of what is going on in research.” Another participant commented positively on having federal expenditures split out from nonfederal expenditures, which are covered in the new Question 12. One participant questioned the rationale for this question: “What is the benefit? Who will use this [data]?”



2.8.2 Question Clarity

All participants understood the question intent from prior years’ experience reporting for the current federal agencies expenditures question. Therefore, there were only the following few comments related to question clarity:

  • One participant expressed some confusion about the distinction between discipline and field and asked that these terms be clarified further. The participant was concerned that they may have reported incorrectly in the past. They use a manual process and make judgment calls about where to code expenditures.

  • One participant questioned the use of the word “examples” in the second bullet: “Is this optional to use these [lists of examples]?” She said she thought she had to put their departments into these listed categories.


2.8.3 Availability of Non-S&E Data

Most participants said they have the data to break out expenditures by federal agency for the additional non-S&E fields. In general, participants described reporting practices and databases that are in alignment with the question. One participant stated that Question 9 also reflects the A-133 reporting practices.


Two participants noted small issues with expanding to non-S&E fields:


  • One institution’s education grants are coded as public service, although some of them qualify as research. They would miss some of the non-S&E research in their total (for this question as well as Question 1).

  • One participant reported that many of the additional R&D fields do not exist at their institution; he seemed to be concerned about having many blank cells in the matrix.


2.8.4 Availability of Data for Prorating Expenditures for Projects Spanning Multiple Fields

The larger issue, mentioned by approximately half of the participants, was that they would have difficulty complying with the instruction in bullet 3 to prorate expenditures for projects that span more than one of the 36 subfields. The following are descriptions of the institutions’ contexts for responding to the question.


  • Institutions may allot all of an award to a single PI or distribute it among co-PIs in different departments. Even if an award is captured in one department for one PI, an institution may still be able to provide breakouts across fields if it can assign multiple department codes when more than one department is involved in a project.

  • Institutions’ financial systems may or may not explicitly identify and code projects that are linked (which would affect their ability to respond to Question 11).

Examples from the interviews follow.


  • One institution assigns research that is difficult to classify to “other sciences.” If it is too difficult to prorate the research, the institution will put it all into “other sciences.”

  • One respondent noted that they have a lot of research that crosses education and engineering. They would only be able to report by the department of the PI, though.

  • One institution has no information to identify the grants that cross fields. They set up a project, e.g., archeology and anthropology, in one department code; they would not know that it was between departments. They would need a new database attribute to identify those with more than one field. Someone would have to tell them what percentage to allot to each field.

  • At another institution, “a grant might be distributed across departments without explicitly being identified as interdisciplinary.”

  • One institution does not prorate expenditures by department when more than one is involved in a project. However, they can break them out, because they assign multiple department codes in such cases.

  • When there are multiple PIs, one institution designates one as the main PI and assigns the grant to him/her, since their accounting system does not allow for multiple PIs.



2.8.5 Respondent Burden

For this question, participants mentioned burden of two types:


  • Burden to do the additional crosswalk for non-S&E fields, and

  • Burden to prorate expenditures for projects that cross multiple fields.


The additional steps of crosswalking departments to the listed disciplines were not seen as the burdensome part of responding to this question. For two institutions, no additional work would be required to report as asked in this question; they already have data available for non-S&E fields by agency. Several said they would need to go through steps to develop a crosswalk from their departments to NSF’s fields for non-S&E, as they have done in the past for the S&E fields. For example, one participant stated that the institution has no systematic way to get the level of detail that the question requires; they manually match a department to each grant, and the university department may differ from the department associated with the grant. Another thought the level of effort would not be great: “Integrating the arts, humanities, and social sciences is not a problem.” They have only limited research in these fields, so it would not be time-consuming.
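To picture the crosswalk step these participants describe, the short sketch below rolls R&D expenditures from institutional department codes up to survey fields. It is a minimal illustration only; the department codes, field names, and amounts are hypothetical and are not drawn from any participating institution.

```python
# Minimal, illustrative sketch: hypothetical department codes, field names, and amounts.

# Institution-specific crosswalk from department codes to survey fields.
DEPT_TO_FIELD = {
    "HIST": "Humanities",                  # example non-S&E field
    "MUSC": "Visual and performing arts",  # example non-S&E field
    "CHEM": "Chemistry",
    "MECH": "Mechanical engineering",
}

# R&D expenditures pulled from the financial system as (department, amount) pairs.
expenditures = [
    ("HIST", 120_000),
    ("MUSC", 45_000),
    ("CHEM", 800_000),
    ("MECH", 310_000),
    ("XXXX", 15_000),   # a department with no crosswalk entry yet
]

# Roll expenditures up to the survey's fields.
totals_by_field = {}
for dept, amount in expenditures:
    field = DEPT_TO_FIELD.get(dept, "Unmapped - needs review")
    totals_by_field[field] = totals_by_field.get(field, 0) + amount

for field, total in sorted(totals_by_field.items()):
    print(f"{field}: ${total:,}")
```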


The larger burden was expected to be associated with changing data collection and financial systems in order to prorate for projects involving multiple fields. One institution mentioned that they would not have to prorate; they always set up separate cost centers for projects that cross fields. However, the majority of institutions mentioned issues with allocating projects. Those that do not currently track linkages said they would need to start coding these occurrences at the proposal or award stage. Institutions that currently lump multidisciplinary awards in one department would need to change their coding systems. Participants made the following comments.


  • One participant stated that they would need a new attribute to identify those projects with more than one field. Then someone would need to tell them what percentage to distribute to each field.

  • At one institution, they could split the grant, but it would be difficult to do. They have a proposal form that proposal staff complete for the PI. Any change to the proposal form would require additional programming, which would be burdensome both financially and operationally.

  • One participant explained that there would be considerable effort in trying to prorate the expenditures to multiple owning departments or fields. If the PI requests, there can be subaccounts, but this is at the discretion of the PIs.

  • One institution would have to review each project to identify which involved multiple departments, and then use the department codes to prorate.

  • Another participant said, “We have an account assigned to a specific field.” They assign unique account numbers to projects that are split (interdisciplinary).

  • One participant described their issue about prorating expenditures as follows: They do not have a way to capture multidisciplinary research in their system and they cannot prorate the expenditures across the different disciplines. If the award comes into one of their multidisciplinary centers, they can report that, but not otherwise.

  • Prorating was expected to be difficult for an institution that stated “we are not that fine-tuned unless it is very obvious.” The institution has accounts for projects that have joint PIs, but they do not prorate the costs. The account is established based on the lead PI and his/her department.

  • One participant said that prorating “would cause heartburn because people would be making decisions, which may or may not be the same.”
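As a point of reference for the prorating discussed above, the arithmetic itself is simple once an institution has a percentage split for a project; the difficulty participants describe lies in obtaining those splits. The sketch below is illustrative only, with an invented project total and invented field shares.

```python
# Illustrative only: invented project total and invented field shares.

def prorate(total_expenditure, field_shares):
    """Split a project's expenditures across fields by share (shares sum to 1.0)."""
    return {field: round(total_expenditure * share, 2)
            for field, share in field_shares.items()}

# Example: an award recorded under a single department code but judged
# (e.g., by the PI at proposal time) to span two of the survey's subfields.
allocation = prorate(500_000, {"Anthropology": 0.6, "Archeology": 0.4})
print(allocation)   # {'Anthropology': 300000.0, 'Archeology': 200000.0}
```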



2.8.6 Format of Question

All but three of the participants said they preferred to have the examples of fields or departments on the same page. They liked not having to flip pages to find and view the disciplines within fields. One participant stated that this approach also made the task easier to understand.


One institution stated that the advantage of the new format is that the respondent does not have to look back and forth [between definitions and the question]. On the other hand, the new format makes the survey look longer than the current survey. Another participant pointed out that “the format [examples on the bottom of the page] is advantageous for new people” [new survey respondents].


Of the three participants who did not prefer the new format, the two who were asked about their format preference said that either version (current or new) was fine. They said they were more concerned about being able to supply the data for the question than about the format they used to report the information. One participant felt overwhelmed by the increased number of pages in the hard copy version and suggested that the web survey version provide a link that would open a frame.



2.8.7 Other Comments

Several participants had the following additional comments about Question 9.


  • The previous survey design had line numbers, which corresponded with the computer program output that one institution used to respond to the survey. They would like to see the line numbers retained.

  • One respondent commented that the note on HHS including NIH was helpful.

  • Another said that the agencies listed cover the majority of federal funding they receive.

  • One commented on liking the industrial management discipline under “Other engineering.”



2.9 Question 10: R&D Expenditures From “Other” Federal Agencies

Two versions of Question 10 were used over the course of the site visits. The first version offered space for six “Other” agencies. The revised version had space for 10 “Other” agencies, plus an additional line for additional “Other” agencies not included in the list of 10 or in Question 9.


Interviewers asked participants whether they would be able to respond and if so, how they would provide their data. They also asked for any suggestions for making the question easier to complete.



2.9.1 Availability of Data and Burden

Participants easily understood what the question was asking for and were generally positive about being able to respond to it if it were added to the survey. Of the 16 sites where this question was discussed, 14 said that they could provide the data after some level of additional work with their existing databases. Approximately half of these participants said that they have an assigned agency code for each project. Five participants mentioned that they could use the CFDA codes. (See Table 1 for a list of CFDA codes at the two-digit level.) However, they pointed out that some federal agencies do not have CFDA codes, so they would need to add some codes for completeness. Others mentioned in more general terms that the question corresponds with their record-keeping systems. Two institutions use their own set of internally developed codes, not CFDA codes, but could easily use these codes to respond to the question.
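For the participants who mentioned CFDA codes, the lookup they describe amounts to taking the two-digit prefix of each award’s CFDA number and mapping it to an agency. The sketch below is illustrative only; the award numbers and amounts are hypothetical, and the prefix-to-agency pairs are a small subset of Table 1.

```python
# Illustrative only: hypothetical awards; the prefix-to-agency pairs echo Table 1.

CFDA_PREFIX_TO_AGENCY = {
    "10": "Department of Agriculture",
    "47": "National Science Foundation",
    "81": "Department of Energy",
    "93": "Department of Health and Human Services",
}

# Awards as (CFDA number, FY R&D expenditure); CFDA numbers are "NN.NNN" strings.
awards = [
    ("10.310", 250_000),
    ("47.074", 1_200_000),
    ("93.859", 2_750_000),
    ("81.049", 400_000),
    (None, 90_000),       # an award with no CFDA number, as some participants noted
]

totals = {}
for cfda, amount in awards:
    prefix = cfda.split(".")[0] if cfda else None
    agency = CFDA_PREFIX_TO_AGENCY.get(prefix, "No CFDA code - needs institutional code")
    totals[agency] = totals.get(agency, 0) + amount

for agency, total in totals.items():
    print(f"{agency}: ${total:,}")
```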


Participants from the remaining two institutions indicated that there would be more substantial burden involved in responding to this question; one of these institutions would not be able to provide the data at all. These participants mentioned the need to develop new queries. A participant from the other institution said “compiling these data would be a nightmare,” although the description of the steps required to retrieve the data sounded similar to what the other institutions described.


Table 1. Codes from the Catalog of Federal Domestic Assistance (CFDA)

Code    Agency/Program
10      Department of Agriculture
11      Department of Commerce
12      Department of Defense
84      Department of Education
93      Department of Health and Human Services
14      Department of Housing and Urban Development
16      Department of Justice
17      Department of Labor
19      Department of State
15      Department of the Interior
21      Department of the Treasury
20      Department of Transportation
64      Department of Veterans Affairs
90      Elections Assistance Commission
66      Environmental Protection Agency
23      Appalachian Regional Commission
27      Office of Personnel Management
29      Commission on Civil Rights
30      Equal Employment Opportunity Commission
32      Federal Communications Commission
33      Federal Maritime Commission
34      Federal Mediation and Conciliation Service
36      Federal Trade Commission
39      General Services Administration
40      Government Printing Office
42      Library of Congress
43      National Aeronautics and Space Administration
44      National Credit Union Administration
45      National Endowment for the Arts
45      National Endowment for the Humanities
45      Federal Council on the Arts and the Humanities
45      Institute of Museum and Library Services
46      National Labor Relations Board
47      National Science Foundation
57      Railroad Retirement Board
58      Securities and Exchange Commission
59      Small Business Administration
68      National Gallery of Art
70      Overseas Private Investment Corporation
77      Nuclear Regulatory Commission
78      Commodity Futures Trading Commission
81      Department of Energy
85      Harry Truman Scholarship Foundation
85      Christopher Columbus Fellowship Foundation
85      Barry Goldwater Scholarship and Excellence in Education Foundation
85      Woodrow Wilson International Center for Scholars
85      The Morris K Udall Scholarship and Excellence in National Environmental Policy Foundation
85      James Madison Memorial Fellowship Foundation
85      Smithsonian Institute Fellowship Foundation
86      Pension Benefit Guaranty Corporation
88      Architectural and Transportation Barriers Compliance Board
89      National Archives and Records Administration
90      Denali Commission
90      Delta Regional Authority
90      Japan US Friendship Commission
91      United States Institute of Peace
94      Corporation for National and Community Service
96      Social Security Administration
97      Department of Homeland Security
98      United States Agency for International Development


SOURCE: Catalog of Federal Domestic Assistance (www.cfda.gov).




2.9.2 Response Approach

Participants at four sites were asked to look at the format of the second version of the question and describe how they would report the requested information. Their response would depend on how they ran the query, since there was not a specific instruction about it. Three initially said that they might run a query to list by agency number or sort alphabetically. These participants reconsidered and decided that it might be best to sort based on descending order of funding and report on lines a – k in that order, leaving the smallest ones lumped together on the last line for “Other agencies not listed above or in Question 9.”
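A compact illustration of the ordering approach these participants settled on follows. It assumes, as in the revised version of the question, 10 named lines plus a final catch-all line; the agencies and amounts shown are hypothetical.

```python
# Illustrative only: hypothetical "other" agency totals; assumes 10 named lines
# (a-j) plus a final catch-all line (k), as in the revised Question 10 layout.

other_agency_totals = {
    "Department of Transportation": 900_000,
    "Department of Education": 2_100_000,
    "Environmental Protection Agency": 450_000,
    "Department of Justice": 130_000,
    "Small Business Administration": 60_000,
}

NAMED_LINES = 10

ranked = sorted(other_agency_totals.items(), key=lambda item: item[1], reverse=True)
named = ranked[:NAMED_LINES]                                   # largest amounts first
lumped = sum(amount for _, amount in ranked[NAMED_LINES:])     # remainder for line k

for letter, (agency, amount) in zip("abcdefghij", named):
    print(f"{letter}. {agency}: ${amount:,}")
print(f"k. Other agencies not listed above or in Question 9: ${lumped:,}")
```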



2.9.3 Suggestions and Comments

Participants’ suggestions for improving this question focused on the number of agencies to report, the instructions for how to supply the data, and how to present the item in the web version of the survey, as follows:


  • Number of agencies to report. Three of the sites that saw the first version (6 lines) said they would need more lines. One of the sites that saw the version with 10 lines said there was “not enough room for all others here.”

  • Instructions for reporting by agency. Several participants wondered about what level of agency/subagency to report for this question and noted that an explicit clarification of the level of federal agency to be reported would be helpful (e.g., department or subagency). Those who said they would use CFDA codes asked what they should do with agencies that did not have CFDA codes. A suggestion was to provide a list of agencies.

  • Instructions for completing the item. The participants who had discussed various ways to list agencies and expenditures suggested including an explicit instruction about how to order agencies in the list.

  • Format of the web-based item. Three participants asked or talked about how this question might appear on the web version. They suggested using a format such as a pick list or offering a drop-down box for each lettered row.


2.10 Question 11: Federal Inter/Multidisciplinary R&D

Two different wordings of this question were used for the site visits. The initial wording is given in the questionnaires in Appendixes A and B; the revised wording is shown in Appendix C. The initial wording, presented in the first eight site visits, was:


How much of the federal R&D expenditures reported in Question 9, row K, column h, was for interdisciplinary or multi-disciplinary projects? Report interdisciplinary or multi-disciplinary projects that involve two or more of the 36 fields of R&D in Question 9.


Prior to the third set of site visits, the question text was simplified and shortened by removing the terms “interdisciplinary” and “multi-disciplinary.” Participants at the remaining nine sites saw a version that read:


How much of the federal R&D expenditures reported in Question 9, row K, column h, was for projects that involve two or more of the 36 fields of R&D in Question 9?


Interviewers asked participants to address whether their records have the information required to respond to the question and what level of effort would be involved to prepare their response. Respondents were also asked about the ease of and preference for reporting research that involves two or more fields at the level of the 10 broad R&D fields or the 36 R&D subfields. At some sites, the participants provided an estimate of the percentage of their research that involves multiple fields, including any units or centers that conduct research across research fields. In the course of the discussions, participants also mentioned a variety of issues with this question; these concerns are also summarized below.



2.10.1 Question Clarity

Generally, participants across the sites did not seem to have any difficulty understanding what this question was asking for. For example, at one site where participants saw the shorter text version, they said they did not have any issues with the question intent or wording. Instead, for them, “our issue will be figuring them [projects that cross multiple fields] out.”


Participants did show signs of misunderstanding during a visit at a site where the longer text version was used. The participants suggested that the survey should be very specific about what is meant by “interdisciplinary.” They asked if a center grant would be considered interdisciplinary. These participants did not seem to notice the phrase “two or more of the 36 fields of R&D in question 9” within the question text. After that instruction was pointed out to them, they still did not understand how to answer this question.


One participant did suggest that to improve the question clarity, emphasis could be added by using the explicit instruction: “Include any projects that cross the 36 fields of R&D.”



2.10.2 Data Availability and Response Burden

When asked if their institutions have records to identify projects that cross two or more fields, nine of the 15 participants who directly addressed this question said that they do not have an attribute or code in their financial systems. Some of the variation across institutions is indicated by the following responses when asked about response approach and burden.


  • One institution does not have data that show whether two faculty from different departments are working on the same grant. To identify this, they would have to go across different data systems, across many grants (10,000 accounts) and filter all of it manually.

  • Participants from one institution said they do not have a code that indicates multidisciplinary research. They could identify funds that have multiple accounts, but that would not guarantee that the work is multidisciplinary; it would only identify projects with more than one PI.

  • One site captures expenditures in the department where the PI has his/her appointment. For projects that have multiple PIs, they designate one PI as the main one. Then the field is determined by the department of the main PI.


Two institutions split their awards into multiple accounts when multiple PIs are involved. However, they do not have an attribute that explicitly identifies and connects these awards in their financial systems. These two institutions felt that, with some effort, they could derive the data for Question 11. For example, one of the institutions could identify projects that cross departments (by identifying in a data run which projects have more than one department code), which would allow them to capture, at a minimum, the research that is taking place in two or more departments.
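The “data run” this institution describes can be pictured as a scan for projects whose accounts carry more than one department code. The sketch below is illustrative only; the account records are hypothetical, and, as the participants note, the result is at best a lower bound on multidisciplinary activity.

```python
# Illustrative only: hypothetical account records (project_id, department_code, amount).

accounts = [
    ("P-1001", "BIOL", 300_000),
    ("P-1001", "CHEM", 150_000),   # same project, second department code
    ("P-1002", "MECH", 500_000),
    ("P-1003", "HIST", 80_000),
]

# Collect the department codes seen under each project.
depts_by_project = {}
for project, dept, _amount in accounts:
    depts_by_project.setdefault(project, set()).add(dept)

# Projects with accounts in more than one department.
multi_dept = {p for p, depts in depts_by_project.items() if len(depts) > 1}
multi_dept_total = sum(amount for project, _dept, amount in accounts if project in multi_dept)

print(sorted(multi_dept))        # ['P-1001']
print(f"${multi_dept_total:,}")  # $450,000
```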


Three institutions said they currently have the capability to assign multiple codes to awards for some or all projects. This did not necessarily mean that they are currently picking up all multiple department research for reporting. Their situations are as follows.


  • At one institution, it is up to the PIs to tell the financial office to code for associated departments or the additional codes will not be assigned. At this institution, they would need to inform PIs of the importance of assigning multiple codes so that the institution could accurately report for this question.

  • At the second institution, participants said they have the ability to assign multiple department codes when more than one is involved on a project, so they could report the amount with more than one department. However, there are only a few projects that involve multiple departments.

  • For the third institution, if one project has separate accounts for each discipline, they can be identified. However, if there is only one account for a project, special programs have to be written to identify them.

A final institution said they could easily report for this question as long as the work is done in their interdisciplinary centers, where they conduct only interdisciplinary work. The projects within the interdisciplinary centers are coded by choosing one of the NSF fields. However, the participants did not indicate whether they have noncenter projects that cross multiple fields and, if so, what steps they would have to take to report those projects in Question 11.



2.10.3 Reporting Inter/Multidisciplinary at Level of Broad Fields vs. Subfields

At approximately half of the sites, participants were asked to give their opinions on whether it would be easier or preferable to respond to the question by reporting at the level of the 36 subfields or 10 broad fields. At three sites, the response was that it would not matter. Since they do not currently have an attribute to capture such research, they would have to build in this capability and the level of effort would be the same. Three sites would prefer to report at the level of 10 broad fields. Two sites would prefer the 36 subfields, since they track at the level of the department now.



2.10.4 Estimated Percentage of R&D That Involves Multiple Fields

Four institutions were asked to estimate the amount of their research that crosses multiple fields. For two of the institutions, participants did not want to respond with a specific number:


  • Participants for one institution declined to estimate, saying that they did not have a good sense of the level of multidisciplinary activity.

  • Another institution’s participants replied that about half of the proposals have more than one PI but declined to guess at the proportion of all research that is multidisciplinary.


Participants for two other institutions were willing to estimate, as follows.


  • One participant said “high—40 to 60 percent.”

  • For the fourth institution, the respondent thought that less than 10 percent of research involved multiple fields (departments) now. However, the response was offered with a caveat: It is possible for them to have two accounts within the same department.


2.10.5 General Comments and Concerns

In the discussions of this item, participants recognized its increasing importance and prominence while also voicing a variety of cautions and concerns. As one participant stated, universities will want to report interdisciplinary research because it is considered a good thing and it is rewarded. However, concerns were related to developing a standardized definition that all survey respondents could understand and apply to their reporting, receiving some guidance in how to treat interdisciplinary/collaborative centers, and avoiding under- or overcounting due to differences in tracking projects and people within and across institutions.


Cautionary comments included the following.


  • The question would likely yield wide variation in reporting; there is no uniform understanding of multidisciplinary and interdisciplinary research and how to deal with it.

  • “Interdisciplinary” is a fluid concept in a constant state of flux. What is considered to be interdisciplinary today will not necessarily still be interdisciplinary tomorrow.

  • Participants from one institution stated that interdisciplinary is easier to define at the broader level, and that the HERD grouping does not reflect how disciplines naturally group themselves.

  • One institution has seen an increase in the number of subcontracts. The participant suspects that the work is of an interdisciplinary nature and the subcontractors provide a piece of the research not available at their own institution. They have no way of assigning a field or department to the subcontractor.

  • There is an assumption that research centers are set up to be interdisciplinary or collaborative. However, that is not the case at all institutions. “R&D projects with accounts in more than one department” would capture the majority.

  • Some interdisciplinary centers have their own department code. At those institutions, they could more easily identify projects to report for Question 11. Where centers do not have their own department code, however, projects within centers would still need to be reviewed to see if they qualify as crossing multiple fields or subfields.

  • One institution said that it is not feasible to use expenditure data to identify everyone working on an R&D project by their home department. They also mentioned the issue of how to deal with projects involving faculty who have joint appointments (many faculty have joint appointments or multiple home departments).

  • One set of participants concluded the discussion of interdisciplinary R&D with a caution that people were going to find ways to produce the interdisciplinary data, but then what it actually meant would be difficult to determine.

  • At two sites, participants stated issues in terms of what they might miss or over report:

  • One institution pointed out that there was a risk of double counting. They also said that a question like this will be highly prone to recording error.

  • One site that breaks awards across PIs said they would not be able to catch all interdisciplinary work, so they would be providing an undercount.


2.11 Question 12: R&D Expenditures From Nonfederal Sources by Field

Two different wordings of this question were used for the site visits; at the first four sites the initial wording was as follows.


Question 12. What were your FY 2009 R&D expenditures for the nonfederal sources below in each field of R&D?

  • The total for each column in row K should match the corresponding sources reported in Question 1.

  • If an individual project involves more than one of the 36 fields of R&D, please prorate expenditures when possible and report the amount for each field involved. (Note: Question 13 asks for expenditures for interdisciplinary and multidisciplinary R&D regardless of whether you prorate the expenditures.)

The response matrix listed the R&D fields as rows (with row K as the total) and the following nonfederal sources of funds as columns:

  (a) State and local government
  (b) Industry
  (c) Nonprofit orgs. and donors
  (d) Institutional funds
  (e) Other nonfederal sources
  (f) Total



All four sites understood that this was a new question and said that they could provide the required data. Their individual comments follow:


  • One institution said that they could produce the requested data. They have the source of the funds in their database. They also said that it will take more time to link the source of funds to the R&D fields, or “departments.”

  • Another institution said that they could easily produce the requested data; it would not be a problem. The requested information is already in their system and all it requires is an additional query to their database.

  • Another respondent said that they could provide the requested data. They have the necessary information at the level required by Question 12. They did stress the additional burden and the additional steps to provide these data.

  • Another respondent said that they could provide the requested data, but they would have to use a crosswalk to get the expenditure data into the R&D fields. They too stressed the additional burden associated with providing this information. To provide this information, this site will have to reprogram their systems.


After the first round of cognitive testing, Question 12 was revised to include explanatory material under the heading for R&D Fields “(See Question 9, pp. 8-9).”



Question 12. What were your FY 2009 R&D expenditures for the nonfederal sources below in each field of R&D?

  • The total for each column in row K should match the corresponding sources reported in Question 1.

  • If an individual project involves more than one of the 36 fields of R&D, please prorate expenditures when possible and report the amount for each field involved. (Note: Question 13 asks for expenditures for interdisciplinary and multidisciplinary R&D regardless of whether you prorate the expenditures.)

The response matrix again listed the R&D fields as rows (now with the note “See Question 9, pp. 8-9” under the R&D Fields heading) and the following nonfederal sources of funds as columns:

  (a) State and local government
  (b) Industry
  (c) Nonprofit orgs.
  (d) Institutional funds
  (e) Other nonfederal sources
  (f) Total



In subsequent rounds of testing, all 13 sites were shown the revised Question 12 above. Three of the sites did not comment on Question 12 because of time constraints. Of the 10 sites that commented on the version of Question 12 above, 5 said that they could provide the data or provide the data with relative ease.


  • One site tempered “ease” with the caveat that identifying and reporting multidisciplinary R&D was a problem.

  • Another site said that Question 12 was consistent with their data. Their only issue was cost sharing; they only track committed cost sharing.

  • Another site said that Question 12 was “straightforward” and “minimal work.” Institutional funds would be difficult for them to report. They did not think that the field examples needed to be repeated from Question 9.

  • Another site said that they had the requested information and it was not difficult to produce. This entailed no extra programming for them.

  • Another site simply said that they could provide these data. They requested that the same field classifications be used that are used in Question 9.


Other sites were less specific about the ease/difficulty they anticipated in providing the data, but made other types of comments:


  • One site requested automatic fills from Question 1.

  • Another site talked about having edit checks to ensure that the data corresponded with previous responses.

  • One site requested that the (a), (b), (c), … (f) be removed since these no longer refer to anything that comes earlier in the questionnaire.

  • Another site said that they could not report the original funding source, only the proximal source, but otherwise the requested data are in their databases.


Only one site reported confusion about what to report. This site said that Question 12 would be labor intensive, and they were not sure what to prorate—salary, effort? This site also commented that reporting for nonprofits would be difficult. The respondent also said that when they report state funds, they report only funds from their own state; funds from other states would be very small amounts that they would report under “nonprofit.”



2.12 Question 13: Nonfederal Inter/Multidisciplinary R&D

The focus for Question 13 is nonfederal expenditures for interdisciplinary and multidisciplinary research. Question 13 is similar to Question 11 in that it asks survey respondents to indicate the expenditures for a segment of the total reported for the question that precedes it (nonfederal expenditures by field). This question was simplified in the same manner as Question 11 between the first two and second two sets of site visits.


The wording of the first version of the question was:


How much of the nonfederal R&D expenditures reported in Question 12, row k, column f, was for interdisciplinary or multi-disciplinary projects? Report interdisciplinary or multi-disciplinary projects that involve two or more of the 36 fields of R&D in Question 12.


The wording of the revised version was:


How much of the nonfederal R&D expenditures reported in Question 12, row k, column f, was for projects that involve two or more of the 36 fields of R&D in Question 12?


At many sites, Question 13 was not discussed in detail, due both to time constraints and to its similarity to Question 11. At seven sites, participants were told that NSF is considering making changes to reflect new and emerging research fields. Site visit interviewers explained the two options for incorporating those changes into the HERD survey: (1) delay making changes to the survey until all the field changes are decided, or (2) make some changes when the survey redesign is introduced in 2010 and phase in additional changes later.


The responses of the seven participants were as follows.


  • Four participants would prefer to see all the changes at once, later. Reasons for this view included a preference to have the taxonomy finalized before being notified of changes, an opinion that “it would create a problem if the disciplines are significantly different from IPEDS,” and an opinion that if the changes will be labor intensive, it would be better to do them all at once. One institution would like the fields to use the same codes as the Classification of Instructional Programs (CIP), saying that this would make the reporting easier.

  • Two participants said that making gradual changes to the survey would be fine. These participants said that they look for changes when they start each cycle of the survey.

  • One participant said it did not matter whether the changes were made piecemeal or all at once.


2.13 Question 14: Cost Elements

Question 14 is a new item that asks respondents to list out the cost elements of the total expenditures reported in Question 1. The item is divided into direct costs, including salaries and wages, fringe benefits, software, equipment, pass-throughs to other organizations, and other direct costs. Recovered and unrecovered indirect costs are also requested. Some of this information is identical to requests in earlier parts of the questionnaire (e.g., pass-throughs; unrecovered indirect costs).


All but two institutions said they could report most of the cost elements with their existing record system. In particular, they could separate direct from indirect costs as well as focus on expenditures for research purposes. One of the two institutions said that to provide answers to these questions they would need to redo their entire database. However, independent of the NSF survey, they had already been planning for this change to their system. Consequently, they will be able to respond to this query within the next few years.


The other institution that could not provide these data said that their current accounting systems do not support reporting at the level of detail requested in this question. They have no way of “getting to those line items, in terms of the totals reported: in terms of what is salary, what is fringe, what is software.” This institution said the question would significantly increase their response burden because a separate set of reports would have to be run, and the data extracted, for each of the lines in Question 14.


In the remainder of this section, each of the individual items within this question is discussed. For simplicity, the two institutions that could not provide these data are excluded from the discussion.



2.13.1 Salaries and Wages

All respondents could provide this information. There was a question about how salaries paid through a cost sharing arrangement would be reported. Of the nine institutions that responded to this question, eight said that salaries for work under the grant would be included in the salaries and wages line. Institutions were also asked where, or whether, salary would be reported for faculty who work on research during the academic year but only charge their time to a specific grant during the summer. The three institutions that addressed this question explained that this would be captured in the general ledger as committed cost sharing and would also flow through the university’s payroll system. In general, respondents consistently reported that any type of committed cost sharing would be treated this way. The one respondent who would not report cost sharing in salaries and wages said it would be included in the “other direct costs” line.


Two respondents discussed the treatment of endowed chairs and handled it in different ways: in one case the funds would show up as salaries, and in the other case they would not.



2.13.2 Fringe Benefits

All of the institutions were able to separate out fringe benefits using the definition provided on the survey; the one point of variation was how sick leave and vacation were counted. Of the 11 institutions that reported on sick leave and vacation, 9 said these were included in salaries and wages; only 2 said they could be reported as part of fringe benefits.



2.13.3 Software

This question asks respondents to report on both capitalized and noncapitalized software. Universally, respondents said they could provide the information on any software that has been capitalized. In some of these cases, respondents said that they typically do not capitalize these purchases, but they are set up to do so if the circumstances arise.


The ability to report on noncapitalized software was quite different. Of the 14 institutions that addressed this question during the interview, 7 said they would be able to identify these costs. These seven have special codes in their systems that identify software; the one type of purchase they could not identify was software that came preloaded with computers. The other seven institutions group software below the capitalization threshold into a general supplies category and could not identify this information without sifting through a large number of invoices.



2.13.4 Equipment, Pass-Throughs, and Other Direct Costs

These categories of expenses were handled in a relatively uniform way across all sites. All institutions could identify the expenses associated with research and report the expenditures. With respect to equipment, respondents were asked how they would handle lease-to-own arrangements and rentals. Few locations had any lease-to-own expenses. Of the three respondents who discussed these expenses, two said they would capitalize the lease, while the other would put it in the “other direct costs” category. A larger number of institutions rent equipment. Of the eight organizations that reported this, seven would put rentals in the “other direct costs” category.


For pass-throughs, several respondents were not clear about how to handle vendor payments. The general belief was that vendor payments should be included in other direct costs. One respondent suggested adding an instruction to clarify what to do with vendor payments.



2.13.5 Indirect Costs

This item consists of two components—recovered and unrecovered indirect costs. Respondents’ ability to supply this information mirrored the discussions surrounding Question 1. Those organizations that could provide each of these expenses were also able to provide the information for this item.



2.14 Question 15: Capitalization Threshold

This question evolved over the course of the site visits. In the first several visits, respondents were asked for the dollar threshold and years of useful life for both software and equipment. The initial respondents said that there was probably no need to ask for the years, as it did not vary much for major purchases. In later rounds, the question was modified by deleting the request for years of useful life.


All respondents said they could provide this information.



2.15 Question 16: R&D Equipment

Question 16 was not addressed in 9 out of the 16 interviews due to time constraints and the similarity of the question to what is on the current survey. However, all seven institutions that were asked about Question 16 said that it would be possible for them to respond to it. Respondents largely indicated that they would report only capitalized equipment for Question 16. Only one institution indicated that they would include both capitalized and noncapitalized equipment. This institution suggested that the question specify the capitalization threshold to use if only capitalized equipment was to be reported. A number of institutions also indicated that the total for Question 16 should match row d of Question 14. When asked if a nonfederal column would be useful, one institution said that it could be, but the existing layout was fine as well.



2.16 Questions 17 Through 19: R&D Personnel

2.16.1 Question Completion and General Comments

When the revised survey questionnaire was transmitted to the participating institutions, they were asked to complete the personnel questions (Questions 17, 18, and 19) prior to the site visit, if possible. None of the 16 institutions completed the questions as requested. One institution did tabulate some data on staff head counts.


As a group, participants commented on their inability to respond or the level of effort that would be required to respond. Key findings include the following:


  • Two participants stated that they did not know how they could produce the full-time-equivalent (FTE) data and another stated that they would not offer to give FTE numbers.

  • Other participants focused on the differences from their current record-keeping practices. No universities reported that they have existing records that could provide the requested data without new programming, and two referred to looking person by person or grant by grant.

  • Four institutions expressed concern that the questions would elicit data that are inconsistent with their reports of faculty to IPEDS and/or suggested that NSF consider using the IPEDS definitions for collecting personnel data.


Complexities in counting personnel in general were cited by some respondents. The issues they raised included:


  • Budgeted positions in their human resources system may or may not be filled at a given point in time.

  • The arrival and departure of personnel over the course of a year would affect estimates.

  • Persons could move from full-time to part-time status (or vice versa) during the year, and dual appointments would cause some people to be counted in more than one field.


2.16.2 Identifying R&D Personnel

Universities reported a number of measures that they would use to identify personnel who work on R&D activities. Most institutions (10 of the 16) cited existing records that include payments from R&D accounts, payroll systems, and effort reports. However, in general, they stated that their systems are not currently configured for this purpose and additional programming effort or manual processes would be required. Those respondents who discussed effort reporting as a means of identifying R&D personnel generally stated that they could identify committed cost sharing, but that they do not track uncommitted cost sharing.


Two institutions, one of which was a system office representing several large research universities, cited problems with identifying R&D personnel. In one case, the respondent stated that PIs are typically paid by departments and not by grants. This university also reported that they have some staff who are “zero appointments,” i.e., postdoctorates who are not paid salaries and therefore cannot be tracked by payment of salary from a research account. A respondent at another university noted that there could be PIs who are not tied to research accounts because they are paid with university funds.



2.16.3 Providing Data for R&D FTEs

Nearly all universities raised some issues with the reporting of FTEs. One respondent noted that FTE is “an elusive concept” in higher education, and several asked NSF to provide a definition. Some of the concepts and terms that respondents found problematic are described below.


  • A recurring theme was that universities have staff with 9-month, 10-month, and 12-month appointments. At some universities, both 9-month and 12-month appointments are considered one FTE and the 9-month faculty member who also works during the summer is still considered one FTE. At other universities, a 9-month appointment is considered to be 0.75 FTE and summer work could bring this to one FTE.

  • Some universities lack an explicit definition of faculty FTE in terms of credits or hours. Others have variable definitions: one university reported that the FTE definition for faculty was expressed in terms of credit hours but varied by school within the university; another reported that the FTE rule is different for those who teach undergraduates and those who teach graduate students.

  • For nonfaculty personnel, available data varied for hourly, student, and exempt personnel. One university noted that someone could be classified as full time but work 30 hours per week. Another reported that the university had no FTE definition for nonfaculty. Another stated that they do not track hours of work for exempt personnel.

  • Two institutions stated that they could calculate a figure using the percent of salary paid from R&D (or cost shared), but this would be calculated without regard to whether the appointment was a 9-month or 12-month appointment (see the illustrative sketch following this list).

  • One university warned that the R&D FTE would be an undercount of R&D effort and would not capture all the effort of either faculty or nonfaculty. A university system office representing several large research universities made a similar statement, noting that faculty are paid with university funds during the academic year and are paid through grants in the summer.
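
To make the distinction noted in the bullet above concrete, the following is a minimal, purely illustrative Python sketch (not drawn from any site visit; the function name and all figures are assumptions added for illustration) of how an R&D FTE figure might be derived from the percent of salary paid from R&D, with and without adjusting for a 9-month versus 12-month appointment.

def rd_fte(pct_salary_from_rd, appointment_months, adjust_for_appointment=False):
    """Return a hypothetical R&D FTE estimate for one person.

    pct_salary_from_rd: share of salary charged to R&D accounts (0.0 to 1.0).
    appointment_months: length of the appointment in months (e.g., 9 or 12).
    adjust_for_appointment: if True, count a 9-month appointment as 0.75 FTE;
        if False, count any appointment as 1.0 FTE (the approach the two
        institutions described, which ignores appointment length).
    """
    base_fte = appointment_months / 12.0 if adjust_for_appointment else 1.0
    return pct_salary_from_rd * base_fte

# A 9-month faculty member with 40 percent of salary charged to R&D accounts:
print(rd_fte(0.40, 9))                               # 0.40 (appointment length ignored)
print(rd_fte(0.40, 9, adjust_for_appointment=True))  # 0.30 (9 months counted as 0.75 FTE)

As the example shows, the same payroll data can yield different FTE figures depending on the convention adopted, which is consistent with the variation respondents described.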

Other respondents offered comments that were unique (in that each comment was made by a single respondent):


  • The response process would require looking person by person.

  • The response process would require looking by fund numbers.

  • The process would involve “starting from scratch.”

  • The process would be “completely manual.”

Four others suggested that it was unlikely the data could be acquired at all. Again, each of the following comments was made by a single respondent:


  • The respondent had no idea how anyone would track FTE for R&D.

  • The respondent did not know if the task could be done.

  • HR could not do this.

  • Internal discussion had not yielded a way to measure R&D FTE.


2.16.4 Defining Faculty and Non-Faculty R&D Personnel

Some universities asked for further definition as to who should be reported as faculty. Two asked if the intent was to include tenured and tenure-track personnel: one stated that they would use their tenure system, and the other stated that to them, “faculty” means tenured or tenure track. Most universities reported that they have personnel systems with codes for appointments, job titles, and/or rank. Some have tenure systems or codes related to tenure in their systems. One university stated that they did not think tenure status is captured in their faculty personnel database.


Specific examples of how the definition could affect responses include the following:


  • Some universities consider adjunct faculty to be faculty members, whereas others do not. One university stated that adjuncts are contract workers and not salaried employees, and that they are not considered faculty.

  • Two universities referred to classifying faculty according to instructional responsibility as their primary assignment. Two referred explicitly to 50 percent or 51 percent or more instruction as the definition of who would be considered faculty.

  • Three universities noted that administrators such as chairs and deans may have tenure and conduct R&D, but they are currently classified as administrators rather than faculty.

  • Two universities noted that some PIs would be excluded if tenured and tenure track were used as the survey definition because they have faculty who are not on tenure track and conduct R&D. Another noted that they have research faculty and “professors of the practice” in medicine who are not on tenure track.

Regarding nonfaculty personnel, two institutions mentioned that if NSF provided a definition of faculty, the nonfaculty column would not need a definition; it would simply include all other R&D personnel.


Other respondents asked for further definition; for example, two respondents asked how to classify students. Other issues associated with nonfaculty research personnel included the following: one university noted that they do not have FTEs associated with hourly staff, whereas another stated that the hourly staff timecard functions as an effort report. A university with a medical school stated that all research staff are full-time, 12-month employees. One university said that they would have to code the staff manually.



2.16.5 Question 18: Head Count

Nine universities stated that reporting head count would be easier than calculating and reporting FTE; one said head count would be easier, but not if broken out by field. However, several respondents qualified their comments by saying it would still be difficult or time consuming, or would require manual work, to produce head counts. One university stated that they would rather produce FTE than head count. Another stated that once FTE is produced, head count is readily available.


Because of dual appointments, a head count would involve double counting of some personnel. That is, persons would be counted in both fields in which they had appointments if they were doing R&D in both those fields.



2.16.6 Question 19: Postdoctoral Researchers

Three respondents reported that there is some ambiguity at their universities as to who is a research associate or visiting assistant professor and who is a postdoctoral fellow. Four respondents said that they have no attribute in their data system to identify postdocs, or they do not track this now and would have to add a code in their system or obtain input from another office.


Other respondents reported that they have information about who is a postdoctoral fellow, generally coded in their databases, and can report this information. One qualified this to state that they can report the data “if appropriately titled in the HR system.”


Finally, it was noted by two respondents that a postdoc can have a degree other than a PhD, for example, an MD.


Because of the question wording, a few respondents initially referred to having a PhD degree attribute in their personnel database—not keying specifically on postdoctoral researchers (postdocs) but rather on researchers with doctoral degrees. One person suggested moving up the reference to “postdocs” to an earlier place in the definition.



2.16.7 Alternate Questions 17 and 18

Because of issues identified with Questions 17 through 19 in the first set of site visits, alternate items were developed. One set (Appendix B, pages 27-28) was shown to the second set of institutions, and a revised set (Appendix C, pages 27-28) was shown to the remaining institutions. After discussions concerning Questions 17 through 19, institutional representatives were asked to examine and respond to the alternate questions.


The split between full-time and part-time staff in the initial alternate question (Appendix B) was cited as difficult by two respondents; one also observed that all of their faculty are full-time employees. The item was subsequently revised to eliminate this split and the revised version was shared with universities visited during late January and early February. As noted above, many universities stated that producing head counts, as requested in alternate Question 17, would be easier than producing FTEs, although several still described this as burdensome or difficult.


Three respondents said that including alternate Question 18, asking for the FTE associated with the total head count, was problematic because the requirement to report FTE remained.


Two respondents noted that FTE would reflect research effort, whereas head count would not.



2.17 Questions 20, 21, and 22: Proposals and Awards

Nearly all institutions visited can report on Question 20, number of proposals submitted. Only two institutions said they lack the capability to accurately respond to this question. One institution noted that R&D projects are not separately identified in their pre-award database and that it would be difficult to exclude public service or training projects, which may have research components. Another institution does not have a systematic method of tracking proposals, which can be “verbal” or even written after an award is given. This institution thought that the sum of funded projects and unfunded projects in a given year would be their best estimate of the number of proposals.


Only one institution was able to identify proposals with PIs from more than one department, although this issue was not raised at all institutions. However, most institutions with medical schools appeared to capture medical school proposals in their proposal databases. A few institutions noted that resubmitted proposals would count as new proposals in their system. Respondents at two institutions expressed confusion with the reference to “other documents or actions” in the question. These respondents said that the terms did not seem relevant to proposals.


All 17 institutions have the ability to report on Question 21, but a few potential problems were reported. One institution does not break out R&D from other types of sponsored activities but could report the total number and amount of sponsored awards (including those that were not R&D). Two institutions that conduct clinical trials noted that clinical trial award amounts are based on enrollment in the trial. The award amount would be difficult to report in these cases, since enrollment may not be finalized in the reporting year. Lastly, one institution noted that awards information is contained in multiple databases (covering government and private sources), which complicates providing a response.


Respondents at several institutions were confused about the instructions for reporting multiple-year awards on Question 21. A few of these respondents said the question would be clearer if it asked for awards “actually received” or projects “funded this year.” Another institution suggested distinguishing between the “budget period” and the “project period” to help clarify the question. Two institutions suggested breaking the question into multiple parts that would capture single-year and multiple-year awards separately.


Question 22 (collaborative awards) generated considerable confusion among respondents. At least seven institutions initially interpreted “collaborative awards” to mean subawards or other subrecipient arrangements. A few respondents took the question to mean internal collaboration between multiple faculty members. When the question was clarified, only four of the institutions indicated that they had participated in a collaborative project with another institution. Two of these institutions monitor and can report the number of collaborative awards.


Respondents suggested clarifying the question by providing further definition of collaborative awards. One institution suggested referring to “two separate higher education institutions receiving their own awards for a collaborative effort,” while another said that collaborative awards are those where all participating institutions receive funds “directly from the funding agency.” One institution cautioned against using the word “equal” to describe collaborative arrangements, as institutions may not take on equal roles in a project. A few institutions noted that collaborative awards were specific to NSF or NIH, which grants linked awards (designated as “L awards”). Respondents who were familiar with NSF collaborative awards noted that “collaborative” often appears in the RFP title.



2.18 Intellectual Property and Commercialization

NSF is exploring the possibility of collecting information on intellectual property (IP) and commercialization associated with R&D at universities. Site visit institutions with technology transfer or technology management offices (or in one case, a technology transfer coordinator in a sponsored programs office) were asked to provide feedback on draft questions on this topic. The site visit teams were able to meet with technology transfer representatives at nine institutions; in four other cases persons were not available or the site visit time was limited. The draft questions were distributed during the site visit and the interviewers and respondents walked through the questions together. The questions appear in Appendices A, B, and C.



2.18.1 Question 1: Intellectual Property Commercialization Transactions

Seven of the nine institutions stated that they could respond to this question about IP transactions fairly easily, and six stated that these are items they track and/or report on routinely. The two institutions that did not respond in this way were relatively small institutions with low levels of IP activity and were in the process of establishing an IP office or were in transition.


A. Disclosures. All of the nine institutions (including the central office of a large multi-university system) reported that they have a form or package required for a disclosure. However, they varied in whether they track predisclosures.


  • Predisclosures. One university reported that predisclosures may sometimes be assigned a disclosure number; they observed that universities vary in how early they give a disclosure number to an invention. This same university noted that the number of disclosures alone does not equal the number of quality inventions; some do not result in a viable technology.

  • Source of disclosure. One university suggested that the question be clarified to specify that universities should report disclosures coming from internal researchers, proposing the wording “include inventions officially disclosed to your institution.”

  • Types of disclosures. A respondent at a land grant university noted that they have technology, plant variety, software, and copyright disclosures. At another university, disclosure types included technology, tangible property (e.g., cell lines), and copyrights. This respondent suggested asking about “disclosure of potentially patentable inventions” and excluding other types. He also noted that some software is protected by patents and some by copyright. At yet another university, the respondent asked whether the question was intended to cover items beyond inventions such as written material and processes.


B. Material Transfer Agreements (MTAs). A number of issues regarding MTAs were identified, despite initial reactions that Question 1 asked about items that universities can easily report on. One university noted that while their office deals with some MTAs, there are also vendors that handle university MTAs. A multi-university system office reported that several offices have authority concerning MTAs. Another university also noted that MTAs are not typically handled by the IP office. Two universities referred to outgoing versus incoming MTAs, and one referred to outgoing MTAs; one institution suggested treating these differently (i.e., collecting them distinctly). Three smaller institutions reported that they had few or no MTAs or were not familiar with them.


C. Licenses and Options. As noted above, two-thirds of IP respondents reported tracking the measures in this question, which included licenses and options. There was discussion at several sites about partially exclusive agreements. While three institutions reported that they understood the term and could report this information, two others questioned the meaning of partially exclusive agreements, and one stated that they do not track nonexclusive licenses. Alternate terms, each mentioned by one respondent, were limited agreement and co-exclusive agreement.


One respondent noted that a university may be a co-owner of a technology and the other owner may sign a license agreement. He asked whether these should be included. Another respondent suggested including letters of intent in this question. It was noted that an option may or may not turn into a license.


D. Active Licenses and Options. All of the institutions can report this information.



2.18.2 Question 2: Patent Applications and Patents Issued

The majority of respondents said they could provide the requested information about patents. Types of patents that were mentioned include utility patents, plant patents, and design patents. Utility patents were noted as the main type by one institution; some others commented that they have few plant patent filings; one stated that they do not file design patents, and another did not know whether they did. A land grant university mentioned plant variety protection (PVP), which is different from a plant patent. One respondent suggested specifying that respondents should include utility patents, whereas another suggested collecting information by type (utility, plant, design).


Other issues that were raised included the following:


  • One respondent noted that there are first filings and follow-up filings, and suggested that NSF might want to clarify the question by specifying “initial filing.” Another noted that there may be multiple filings in one year for the same invention.

  • In addition to patents, three universities mentioned Patent Cooperation Treaty (PCT) applications, which follow a U.S. filing and preserve protection in foreign countries until individual foreign patent applications can be filed.

  • Two universities reported that they were not familiar with continuations in part (CIPs) and one of these was not familiar with PCTs; these were smaller institutions with limited IP activity.


2.18.3 Question 3: Patent Filings Resulting from Federally Funded R&D

Seven of the nine institutions stated that they could report this information, with most stating that they track all funding sources. Five respondents mentioned databases, while others mentioned disclosure forms. One additional participant, who did not specifically state that the institution could report this information, did say that he thought the disclosure form or patent application form would identify funding sources. In one case, this question was not asked.


A respondent noted that there are frequently multiple sources, including industry and university cost sharing, along with federal funding. One respondent noted that they would report any patents or inventions from federally funded R&D, whether federal funding accounted for 5 percent or 100 percent.



Master Agreements

Four of the universities indicated that they do enter into master agreements with industry for IP rights on multiple patents. Two indicated that they do not. One university was not familiar with master agreements.



2.18.4 Question 4: Start-Up Companies

Seven of the nine institutions stated that they could report information on the number of start-ups involving the licensing of university technology. However, the term “dependent on” was problematic for two institutions. One university system respondent noted that the university may not know about other IP licenses that a start-up company has and noted that some technologies depend on several things coming together. Three respondents suggested clarifications to the language defining start-ups. One respondent suggested the phrase “were formed to use your institution’s technology.” Another suggested “formed to commercialize university technology to the marketplace,” and yet another suggested “were started based on university-licensed technology.”


One respondent noted that they would have to count the start-ups in a given fiscal year based on the date that the technology was licensed to the start-up, but the company could have been formed previously. Another noted that date of incorporation and date of license may not be the same. Yet another university raised this point, but stated that in their case the number would be the same for 2008 no matter how the question was asked. Finally, a university noted that they might award many licenses to a company, but it would be considered a start-up only in the first year.



2.18.5 Question 5: Detailed Information on Start-Up Companies

Participating universities raised a number of issues related to the items asked about start-up companies.


Five institutions stated that they do not have incorporation dates for start-up companies. This is related to the responses noted for Question 4, in which the fiscal year would be based on when the license was given to the start-up rather than on the formation date.


Five universities, including one large system, noted that they have confidentiality agreements with at least some start-ups that would prevent them from identifying the company by name. In some cases, such confidentiality agreements might cover other information as well, such as the technology area or location. One university said that they could report technology sector and ownership stake, but not name, location, or date of incorporation.


One university observed that some start-ups have multiple locations.


The wording concerning ownership stake was discussed by a number of university representatives. Respondents mentioned equity, stock, options, and warrants in connection with “ownership stake.” Two participants offered alternative language: One respondent suggested asking, “Does your university hold equity in the start-up?” The other suggested using “equity stake” rather than “ownership stake.”



2.18.6 Question 6: Income From Intellectual Property Transactions

Most respondents stated that they have records reflecting the types of income from IP listed in the question; one small institution stated that they have no income from IP yet, and another stated that this may be in their accounting system. One respondent noted that they report similar information in the Association of University Technology Managers (AUTM) survey and suggested an approach close to AUTM to avoid reporting the same information in different ways. Specific issues were raised primarily with the sale of intellectual property and the income associated with equity holdings.



Sale of IP

Four institutions stated that they do not sell intellectual property; one stated that they can sell only IP that has been donated to the institution, and such sales are generally zero.



Equity Holdings

Five institutions noted that it is hard to estimate the value of equity holdings, or that such holdings only have value when sold; they can report the value of liquidated equity. One respondent suggested asking for “income realized from disposed equity shares.” Another stated that the words “disposition/sale of equity holdings” would be language they are familiar with. Two institutions stated that they do not receive dividends from equity holdings in start-ups.


One university was unfamiliar with the term “running royalties.”


One source of income not listed in the question is income from litigation. This was mentioned by two institutions, one of which stated that, while not common, it is substantial when it occurs; one respondent stated that they would include this under license income. Others noted that there are also “milestone fees” and “trigger payments” under licenses, which are periodic payments based on specified events in the commercialization cycle. Patent cost reimbursement was also cited as an income source by two institutions, one of which stated that it was separate from license fees and substantial for their university.


Finally, one respondent suggested using the word “revenue” rather than “income,” and another suggested using “receipts under options” rather than “payments under options.”



2.18.7 Question 7: U.S. Patents Filed and Issued, by Field of R&D or Department

Two versions of Question 7 were presented to respondents during the site visits (see the IP modules in the appendices). Version 1 asked respondents to report the numbers of U.S. patents filed and issued by field of R&D; version 2 asked respondents to provide information by discipline based on the department of the inventor. The procedure involved distributing one version of the question, discussing it, and then distributing the other version. The version presented first was alternated across institutions in a given set of sites.



Field of R&D Versus Department

Six institutions reported that allocating by field of R&D would be problematic; one noted that it would require a judgment call to assign patents to fields, and one asked that NSF provide additional guidance on how to do so. Several noted that a manual review of patent documents would be required. The approach of using field of R&D is not consistent with the way these institutions keep their records. One institution reported using key words, whereas most stated that they track the department of the inventor/PI. Another cited a conceptual difficulty, giving as an example the problem of drawing a line between mathematics and computer science.


The use of the department of the inventor/PI (version 2) was preferred by most respondents (six), and some had stated that they would report field of R&D (version 1) that way in any case (based on the department of the inventor). Two noted that field of R&D is more meaningful, but department is easier.


Three noted that their departments do not match the disciplines as listed, so there would be a process of mapping them, in some cases manually. Another issue, raised by two institutions including the system office of a multi-university system, is that some PIs/inventors have more than one departmental appointment (in which case they may choose to designate one department for the patent form) or have appointments in multidisciplinary centers that are hard to classify. One respondent noted that patents are linked to inventions at their institution, not to PIs or departments. One respondent stated that it would be easier to report by school (medicine, engineering, etc.) than by field or department.



Allocation of Patents Across Multiple Fields

Several participants cited challenges associated with allocating patents across multiple fields. In general, however, each type of concern was raised by one or two respondents.


Two participants stated that they choose one department (which one referred to as the “inventor’s field”) for a patent and only record that one in their records.


One stated that it would be difficult to identify contributions of various departments. Another felt that prorating would incorrectly imply equal credit and asked for additional guidance on prorating across fields. One university said that they would have double counting, as their records would show one patent for each department; however, they could provide an unduplicated total number of patents.



Instructions

Some clarifications to the item were requested. One respondent suggested that the instructions explicitly exclude provisional patents. Further definition of the departments or disciplines encompassed by a field category was also requested, since the respondent completing the IP component may not see the entire instrument or the allocation of fields given in Question 9.



2.18.8 Other Comments and Observations About IP

Four institutions noted that at least some questions about IP do not match up well with their current record-keeping systems or that there would be a burden to the institution to conduct manual processes, review documents, or change a database to respond to the questions. Three referred to reporting by discipline as “difficult.” None quantified the level of effort involved. On the other hand, two observed that the IP questions are not very different from their annual reports.


Alternative measures that were suggested included numbers of inventions and jobs created in start-ups.


The areas of sensitivity noted by respondents focused primarily on the detailed questions about start-ups (discussed above). Other information was not cited as particularly confidential.


One large university system respondent observed that it is better to talk about the benefits of technology transfer, but that it is easier to ask about patents and dollars. Reported IP income may appear large, but there are also very high costs associated with creating and maintaining patents, and IP income is not an important revenue source for that university system. The respondent was concerned that publishing this information would give the wrong impression. The costs associated with IP were also cited by another respondent.



2.19 Multiple Campus Institutions

Three multiple campus institutions were asked to give an overview of their structure and reporting capabilities.2 All three were part of state university systems. Two interviews were held at system flagship campuses, and the third was at the system office overseeing all campuses. All three multiple campus institutions have a board of governors that oversees the entire system. Below the board is a university system president, who in turn oversees chancellors at the campus level.


Two of the three university systems currently report each campus separately on the R&D expenditures survey. The survey is managed at the campus level for one system, while at the other the university system’s central office coordinates the survey response for all campuses. The third system has four campuses, but two campuses report jointly and two report separately. The campuses that report jointly are the flagship campus and the medical campus, which are located in different cities; most of the R&D activity for the system takes place on these two campuses, which each have a chancellor but share a single accreditation. Respondents said that it would be possible to report the two campuses separately, but they would prefer to keep them together because this is how they are represented in their accounting system.


All three systems had medical schools or campuses, but the arrangement of these components within the institution varied. As noted above, one system has a single medical campus where medical training and research take place. The other two systems have multiple medical schools. One system has a central medical school on one campus and satellite medical schools on other campuses; the administration of the satellite medical schools is handled primarily by the central medical school. The third university system has five campuses with medical schools, which are administered at the campus level.


Two institutions noted that some research initiatives cut across campuses. One institution said that these “state-wide initiatives” are overseen by vice presidents at the system level. The other institution indicated that R&D activity in multi-campus research units (MRUs) can be broken out by individual campus where work actually takes place. This institution also has a number of other campus-based research institutes and centers, as well as national laboratories. However, the national laboratories are not considered part of the institution for reporting on the survey.



2.20 Preferences for Notification of Survey Changes

Of the 17 sites that participated in the various rounds of cognitive testing, 10 were asked how they would prefer to be notified of the upcoming changes. Time constraints prevented the other seven sites from being queried about their preferences.


Of the 10 institutions that were asked, 4 expressed no preference between receiving the changes all at once or gradually.


  • One participant said that they did not think the revised taxonomy would affect them much.

  • Another participant said that they would adapt to changes gradually without much problem.

  • Another said “it doesn’t matter”; they could adapt either to piecemeal changes or to all of the changes at once.

  • Another said that they could make changes as they go—some for 2010, some for later. They are already making changes.


The other six institutions said that they preferred changes to come all at once.


  • One participant said that they would prefer finalized rather than incremental change. Additionally, they would prefer that the taxonomy be finalized before they are notified of any changes.

  • Another participant said that they would prefer all the changes at once; an unanticipated, labor-intensive change could create so much work that they would rather handle everything at one time.

  • Another participant said that they would prefer to get the changes all at once but would prefer a phased-in implementation.

  • Another participant said that they would prefer to get the changes all at once. Furthermore, the respondent stated that it would be a problem for them if the disciplines are significantly different from IPEDS or the Delaware Study.

  • Another simply said that all the changes at one time would be better.

  • Another also preferred all the changes at once, after 2010.


Seven sites provided closing remarks about future changes and their ability to complete the survey.


  • One site said that a closing date in March would be better for them because at present, the survey coincides with their A-133 audit. After the auditors leave campus would be a good time for them to focus on the survey.


The amount of time that sites would need to prepare for the changes varied greatly from a few weeks to three years.


  • For a change of the magnitude of the FTE questions, one site said that they would need three years to prepare, and the success of the endeavor would depend on the availability of IT staff (staff outside of their department).

  • One site said that to accomplish the changes to their systems that would be required to provide the data, they would have to hire outside consultants to work on their databases. University budgets are often drawn up two years in advance, so this site would need a few years’ notice for these data needs to be included in the university’s budget.

  • Three sites said that an advance letter summarizing the changes or an executive summary of the changes would be helpful.

  • One site said that they need advance notice and did not care what form it came in.

  • Another site said that email was a good way to communicate with them. Some changes, such as the basic/applied research breakout, would take them at least a year to develop systems to comply with.

1 Some very minor protocol changes were made between groups 3 and 4; the group 4 protocol appears in Appendix C.

2 A few other institutions indicated that they had satellite campuses, but that these did not conduct any R&D.
