HERD Questionnaire Passback


Survey of Research and Development Expenditures at Universities and Colleges, FY 2006 through FY 2008


OMB: 3145-0100


SRS Response to OMB HERD Passback

10-05-10


  1. Can SRS provide a written response to the letter from the Bureau of Economic Analysis as part of the public comment?  Specifically, is SRS considering the additions that BEA requests?


We received this letter as part of the public comment for the FY 2009 pilot survey, and have been working with BEA throughout the redesign to respond to as many of their data needs as possible. See attached for our official response to each of their requests in Attachment 2 of their February 24, 2009 letter of support.



  2. How will SRS monitor burden during this round of survey administration to be sure that it is aligned with its expectations from the pilot?


We are prepared to add a burden question similar to the question on the FY 2009 pilot survey if OMB agrees. Alternatively, we can select a sample of institutions, ask them to track their burden hours, and develop a representative estimate. We request guidance from OMB on its preference.



  3. Is the statement on the front of the questionnaire that begins “Your institution will be treated equally” one that SRS has used before? We wonder how it can make such an assurance given that the data are not confidential and could therefore be used by anyone, including other parts of NSF, for a variety of purposes that SRS doesn’t control.


We have not used that exact statement in past surveys. The statement on the previous survey was as follows: “Your response is entirely voluntary: your failure to provide some or all of the information will in no way adversely affect your institution.” To avoid overstating our control, we propose shortening it to “Your institution’s response is entirely voluntary.”



  4. Headcount of R&D personnel and postdocs – The personnel question looks likely to collect valuable data, but we have some concern about the extra burden of the postdocs question. If postdocs are being counted already by another SRS project, and this survey question will only capture a subset of the total postdocs, then why would this survey ask this question only to get incomplete answers?


The purpose of this item is not to obtain a count of postdocs at academic institutions, which, as OMB points out, NSF already collects (from a much broader set of institutions) in another survey, the Survey of Graduate Students and Postdoctorates in Science and Engineering (GSS). Rather, the purpose of this item in HERD is to obtain information about how large a component of the academic R&D workforce (the total obtained in the preceding question, Question 16) postdocs represent. Little is known, although there is much speculation, about how important postdocs are to the R&D being conducted at U.S. institutions of higher education. The information obtained from Question 17 will provide valuable insight into this issue.



  5. Interdisciplinary R&D – Are these data supposed to be separate from the disciplinary R&D expenditures in Question 12? We don’t see how these data can be useful, since the survey specifically does not break out these data into topics such as nanotechnology. It’s our understanding that data users wanted information on specific multidisciplinary research areas, not just one lump category of multidisciplinary research. There are also opportunities for confusion here: if a university has an interdisciplinary project, it’s supposed to divide the total among the Question 12 categories and also respond in Question 13. Is this requirement clear? Does this redesigned question truly respond to the ‘mixed success’ of the earlier question? We have the same concern regarding the conflicts between Questions 9 and 13.



Subsequent to our initial submission of the HERD survey clearance request in July, we decided to delete the question on interdisciplinary R&D for the reasons you cite. This question is no longer included on the FY 2010 survey form. See attached for an updated form.



  6. Basic, applied, and development – While we understand the need for National Patterns (NatPat) basic, applied, and development data, how is it meaningful to ask universities to further divide their R&D portfolios, especially when expenditures are not coded by category in university data systems? Allowing overall estimates to be made does not give us great confidence in the quality of the data. The existing split between basic and applied/development is already problematic; we are concerned that asking universities to arbitrarily split applied and development presents even more problems and further weakens the validity of the data collected.



Because these data are crucial for National Patterns estimates, we have been actively encouraging institutions to code this information at the project award level. We have seen an increase in the number of institutions that code field of research and character of work at the award level, and we continue to recommend this type of coding during each site visit and interview conducted with surveyed institutions. We expect the data quality for this question to continue to improve as more institutions move toward project-level reporting.

We also found that many of the pilot institutions reevaluated their basic research percentages in light of the new question, which provides more detail and examples of each type of work; many revised their estimates substantially. In addition, by removing the confidentiality promise for this question and releasing institution-level data publicly, we hope to motivate institutions to provide more accurate responses.
