OMB Control No.: 3145-0227

Supporting Statement A

for

Office of Inspector General Review of Awardee Implementation of NSF’s Requirement for a Responsible Conduct of Research Program



Name: Dr. Aaron Manka

Address: Office of Inspector General, National Science Foundation, Suite II-705, 4201 Wilson Blvd, Arlington, VA 22230

Email: [email protected]

Telephone: 703-292-5002

Fax: 703-282-9159




A.1. Circumstances Making the Collection of Information Necessary

Section 7009 of the America COMPETES Act (codified at 42 U.S.C. § 1862o-1) required the National Science Foundation to ensure that “each institution that applies for financial assistance from the Foundation for science and engineering research or education describe in its grant proposal a plan to provide appropriate training and oversight in the responsible and ethical conduct of research....” NSF’s implementation of this requirement is described in the NSF Proposal and Award Policies and Procedures Guide, Part II (Award and Administration Guide), Chapter IV, Part B, and is available at nsf.gov/pubs/policydocs/pappguide/nsf11001/aag_4.jsp#IVB.

This information collection is necessary for review of institutional compliance with the responsible conduct of research (“RCR”) requirement. NSF OIG will use the data collected to inform the Foundation and Congress whether current responsible conduct of research programs comply with the COMPETES Act and NSF’s requirement, and to make recommendations to NSF to strengthen these programs if necessary. In particular, there is a similar requirement by the National Institutes of Health for awardees to have an RCR training program, and the approach is quite different from NSF’s. It would be useful for NSF to know whether there are elements of NIH’s RCR implementation that NSF should consider implementing as well. The results of the information collection will also assist NSF OIG in developing an RCR oversight plan.

A.2. Purpose and Use of the Information

The proposed information collection will primarily address how awardees have implemented NSF’s RCR requirement, through interviews with three groups of people at NSF awardees: 1) upper-level administrators (e.g., Vice Presidents or Vice Provosts); 2) program administrators (e.g., Research Integrity Officers or Compliance Officers); and 3) trainees who have participated in the program (undergraduate students, graduate students, and postdoctoral researchers). From the upper-level administrators, we will request information that will allow us to assess the institution’s commitment to the program, including resources (both financial and staff) and how expectations for the program are communicated to faculty and students. From the program administrators, we will request information such as course structure and content, participation requirements and options, compliance tracking, faculty participation, resource allocation, and oversight. From the course participants, we will request information about their experiences in the courses with regard to format, duration, and content, and the benefits and drawbacks of taking an RCR course. The information collection will be conducted through video-conferencing between NSF OIG and awardee participants.

This information will be used for NSF OIG’s effective oversight of NSF programs and operations by reviewing awardees’ compliance with the RCR requirements of the America COMPETES Act and NSF’s Proposal and Award Policies and Procedures Guide. This collection will be used for accountability and evaluation purposes and to inform NSF and Congress on the outcome of the information collection.

A.3. Use of Automation

New data collection for this evaluation will consist of requests for information relevant to the RCR program, followed by video/telephone interviews with selected academic administrators and students. The survey will be administered using appropriate information technology. The detailed and context-specific nature of the information to be collected via interviews precludes the use of automation, but interviewers will be respectful of the burden on their subjects and will keep the interviews as short as possible.

A.4. Efforts to Identify Duplication

Every effort will be made to identify information relevant to the program that can be collected from existing sources prior to interviewing respondents. Before scheduling interviews, we will review each awardee’s (respondent’s) webpages for information relevant to its RCR program; we will then request only the programmatic information not present on the webpages. Information collected from these existing sources will not be duplicated during the interviews.

A.5 Impact on Small Businesses or Other Small Entities

No small businesses will be involved in this study.

A.6. Consequences of Less Frequent Collection

This will be an ongoing data collection in which respondents are asked to provide specific documentation and participate in a single video/telephone interview. We anticipate collecting data from different respondents approximately once per week. Collecting data less frequently would extend the duration of our effort, as we plan to collect information from approximately 100 awardees.

A.7. Special Circumstances for Collection

The proposed data collection fully complies with all guidelines of 5 C.F.R. § 1320.5.

A.8. Federal Register Notice and Outside Consultation

As required by 5 C.F.R. § 1320.8(d), comments on the information collection activities as part of this study were solicited through publication of a 60 Day Notice in the Federal Register on September 14, 2012, at 77 FR 56876. We received three comments, to which we here respond.

Commenter 1

We agreed with one commenter’s conclusions that a) the information is necessary and will have practical use, and b) our estimated burden on respondents appears appropriate. The commenter raised two points in c) and one more in d), which we address here.

The first point in c) concerns our not including the course instructor in the survey. We did not specifically include interviews with instructors for two reasons. First, NSF does not require grantees to provide RCR instruction through a live person; NSF concluded it was acceptable for grantees to direct participants to a website for online RCR education, so there may not be an RCR instructor with whom we could speak. Second, based on our limited experience with grantees that offer live RCR instruction, the RCR administrator is typically also involved in that instruction, so the administrator will have that perspective in those instances. Finally, we want to limit the burden this survey imposes on awardee institutions.

The second point in c) is that our minimum number of RCR training participants (three: one undergraduate, one graduate student, and one postdoc) seems too low to provide a representative sample. We will ask grantees to make available as many students as practical, but because NSF requires grantees to provide RCR training only to students directly supported (paid) from NSF grants, we recognize that many grantees may have few NSF-supported participants. Of course, if a grantee provides RCR education to a broader range of students, postdocs, and faculty than NSF’s minimal requirements, we expect to be able to draw from a larger pool of participants. Indeed, this is one of our questions for the RCR administrator.

The comment in d), that the most significant way to reduce the burden on respondents would be to give clear and timely guidance on what does and does not constitute ‘adequate’ training, goes to one of the purposes of this survey. NSF has not specified what constitutes ‘adequate’ RCR training. We are assessing how grantees have implemented NSF’s requirement, how many of them would welcome further specificity in NSF’s requirement, how many would not, and why. As we note, one likely outcome of our effort would be recommendations back to NSF for improving its RCR program, and, depending on the response data, this could be one of those recommendations.

Commenter 2

Commenter 2 expressed concern that our RCR program data collection strategy “exceeds what is necessary to evaluate recipient’s compliance with NSF’s policy” and “creates an unnecessary and excessive burden on the respondents” and that the interviews “are not necessary nor useful”. We prepared our approach after interviewing experts in RCR training and then conducting a trial run of the oversight program at a university with multiple, decentralized RCR programs. Using a draft questionnaire, respondents provided answers and readily offered both positive and negative feedback about their own RCR training experiences. Indeed, they expressed a desire to have additional discussions beyond the interviews, which we accommodated. Our interviews and questions were necessary and essential to determine compliance with NSF’s RCR policy, to allow us to address the impact of NSF’s requirement on the university, and to determine whether a recommendation to adjust the policy might be warranted. Thus, the phrase “unnecessary and excessive burden” is quite the opposite of our actual experience interacting with upper-level administrators, RCR administrators, and RCR course participants.

Another point raised was that the RCR policy “does not require institutions to demonstrate a commitment particularly through separately allocated resources – financial and/or personnel – to the program”. However, there is a requirement to allocate personnel. As indicated in the NSF Proposal and Award Policies and Procedures Guide, “An institution must designate one or more persons to oversee compliance with the RCR training requirement”.1 As indicated in our Federal Register Notice for this review, for evaluation purposes we are interested in how the institution’s financial and staff resources are used to maintain the RCR training program.

There was also an overall concern that the time estimated for interviews is insufficient. As indicated above, our interview times are based on our previous experience and are estimates, not fixed limits. We expect interview duration to vary among institutions with the size of their RCR programs and total number of participants, since institutions vary widely in the number of trainees supported by NSF. Our estimated interview times are in line with the actual length of the interviews conducted in our trial run.

Commenter 2’s statement that the “NSF OIG lacks the breadth of expertise needed to reasonably assess the effectiveness of individual institutional programs” misses the mark of our intent. Our goal is not to evaluate the effectiveness of individual institutional RCR programs, but rather to evaluate an institution’s methods for implementing its RCR program in response to NSF’s requirement. As Commenter 2 stated, “There is not a required course content or structure nor a requirement that faculty participate in the training activities”. Institutions can freely develop their RCR training plans, and, as stated in our Notice, we seek to collect such information for evaluation purposes. Our staff includes several scientists with the requisite experience to complete such an evaluation.

We agree with Commenter 2 that obtaining an institution’s plan for RCR training would be valuable, and we will obtain such institutional plans as part of our assessment.

Commenter 3

We likewise agree with Commenter 3 that obtaining an institution’s plan for RCR training would be valuable, and we will obtain such institutional plans as part of our assessment. We prepared our approach after interviewing experts in RCR training and then conducting a trial run of the oversight program at a university with multiple, decentralized RCR programs. Using a draft questionnaire, respondents provided answers and readily offered both positive and negative feedback about their own RCR training experiences. Our interviews and questions were necessary and essential to determine compliance with NSF’s RCR policy, to allow us to address the impact of NSF’s requirement on the university, and to determine whether a recommendation to adjust the policy might be warranted.

This commenter suggested we use an electronic survey rather than interviews to gather information. During our trial run, we specifically asked the participants how their responses would differ if they received and answered the same questionnaire electronically versus in an interview. While a couple of interviewees noted it would be logistically more convenient to complete an electronic questionnaire at their leisure, all interviewees preferred the interview format because it allowed a more fruitful discussion.

Commenter 3 also suggested our list of interviewees is incomplete because we exclude faculty. We do not specifically exclude faculty; we found that faculty members are often the RCR program administrators and/or RCR course instructors, and we plan to ask RCR program administrators for information on faculty involvement. We realize faculty mentoring can be an integrated part of an RCR program, and we recognize that institutions have varying RCR training programs suited to their specific research disciplines or type of institution.

Commenter 3 believed we underestimated the time burden on institutions due to systematic auditing and self-assessment. We are not conducting an audit, nor do we require a university to conduct an audit or self-assessment either before or after our information gathering. Our interview times are based on our previous experience and are estimates, not fixed limits. We expect interview duration to vary among institutions with the size of their RCR programs and total number of participants, since institutions vary widely in the number of trainees supported by NSF. Our estimated interview times are in line with the actual length of the interviews conducted in our trial run.

After consideration of these comments, we are moving forward with our submission to OMB.

Regarding outside consultation, in the course of designing our questionnaire we consulted several RCR experts at universities as well as a professional society representing universities with an interest in RCR education.

A.9. Gifts or Remuneration

No payment or gift will be made to respondents as a part of this study.

A.10. Confidentiality Provided to Respondents

Prior to our video/telephone interview, respondents will be informed that no individuals will be identified in our report. Participants will be informed that the information they provide will be kept confidential, except as required by law; data collected from them will be reported in an aggregate form; and their participation is completely voluntary.

A.11. Questions of a Sensitive Nature

The interview questionnaire does not contain questions of a sensitive nature. Personally Identifiable Information gathered as part of this study will be limited to the names of respondents as represented by upper-level administration—we will not seek the names of student respondents. Records of individual responses (in the form of interview notes) will be maintained in the OIG’s confidential files and will be kept confidential to the extent allowed by law.

As described in A.10, survey participants will be told of the general nature of our questions prior to participation, and interview respondents will have the option of not participating. Additionally, even after agreeing to participate, respondents may choose not to answer any particular question.

A.12. Estimates of Burden and Annualized Costs to Respondents

As summarized in Table A.12, the estimated number of awardees from which we will collect information is 100 (with a minimum 93% survey response rate2) over 2 years, or 50 awardees per year. As noted above (A.4), to reduce duplicate information, we will first look for information posted on awardees’ webpages. If the requested information is not posted, we will request it prior to conducting any interviews. Since awardees are already required by NSF to track most of this information, we estimate its production should take no more than 0.5 hours; making the scheduling arrangements could take another 0.5 hours. Thus, the total administrative time burden is approximately one hour.

The expected burden for interview participants will vary depending on the respondent’s role in the awardee’s RCR program. We estimate the interview time with a senior-level administrator to be approximately 0.5 hours, with the RCR program administrator approximately 1.5 hours, and with students and postdocs approximately 1 hour each in a group setting. We therefore estimate each awardee will require 3 hours of interviews, or 4 hours for the total effort including the administrative hour. Thus, to collect information from 50 awardees, we estimate the total time burden as 200 hours per year (4 hours/awardee × 50 awardees/year).

Our estimate of the cost to respondents is presented in Table A.12. We anticipate the pre-interview information request will be handled by an administrator such as an Administrative Specialist, with an average salary of $45,000.3 The average salary for a Vice Chancellor/Vice President is $190,000;4 we estimate the average salary for an RCR Administrator to be $126,621;5 the average salary for a post-doctoral research associate is $47,602;6 the average salary for a graduate research assistant is $24,465;7 and undergraduate students are generally unpaid. The corresponding average hourly salaries8 are $21.63 for Administrative Specialists, $91.35 for senior-level administrators, $60.88 for RCR administrators, $22.89 for postdocs, and $11.76 for graduate students. Thus, as shown in Table A.12 below, the annual cost to each awardee is estimated to be at least $193.28, with the minimum representing only one paid graduate student and one postdoc among the trainees interviewed (undergraduates are unpaid).

Table A.12. Annualized Estimate of Burden per Awardee

                                   | No. of respondents | Est. person-hours | Est. hourly wage | Est. annual cost to awardee
Pre-interview information request  | 1                  | 1                 | $21.63           | $21.63
Senior-level administrator         | 1                  | 0.5               | $91.35           | $45.68
RCR administrator                  | 1                  | 1.5               | $60.88           | $91.32
Students9 and postdocs             | 3                  | 3                 | $11.76 + $22.89  | $34.65
Total per awardee                  | 6                  | 6                 |                  | $193.28
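For readers who wish to verify the arithmetic behind Table A.12, the following minimal sketch (illustrative Python, not part of the collection instrument; the salaries are taken from footnotes 3 through 7 and the 2,080-hour work year from footnote 8) reproduces the hourly wages and the $193.28 per-awardee total, rounding to whole cents as the table does.

```python
from decimal import Decimal, ROUND_HALF_UP

ANNUAL_HOURS = Decimal(2080)  # work hours per year (footnote 8)

def cents(x: Decimal) -> Decimal:
    """Round to whole cents, half-up, matching the table's rounding."""
    return x.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# (respondent type, annual salary in $, person-hours); salaries from
# footnotes 3-7; the undergraduate participant is unpaid (no cost).
rows = [
    ("Pre-interview information request",  45_000, Decimal("1.0")),
    ("Senior-level administrator",        190_000, Decimal("0.5")),
    ("RCR administrator",                 126_621, Decimal("1.5")),
    ("Graduate student",                   24_465, Decimal("1.0")),
    ("Postdoctoral researcher",            47_602, Decimal("1.0")),
]

total = Decimal("0.00")
for name, salary, hours in rows:
    wage = cents(Decimal(salary) / ANNUAL_HOURS)  # hourly wage
    cost = cents(wage * hours)                    # burden cost for this row
    total += cost
    print(f"{name:<34} ${wage}/hr x {hours} h = ${cost}")

print(f"{'Total per awardee':<34} ${total}")      # $193.28
```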



A.13. Capital/Start-up and Operating/Maintenance Costs

There are no capital/start-up or operating/maintenance costs to report.

A.14. Annualized Cost to the Federal Government

Total annual cost to the Federal Government for this data collection includes the government staff time to manage and support the collection of information and the cost to respondents described in A.12. Since OIG staff will participate in all the interviews, they will spend the same four hours per awardee. Additionally, OIG staff will review the information collected by the awardee, which will take approximately one hour. Thus, for the 50 reviews we expect to conduct per year, we estimate approximately 250 hours of OIG staff time per year will be associated with the conduct of this study. Using an average annual salary of $115,000 ($55.29 average hourly salary) for OIG staff who will participate in this activity, this totals approximately $13,822.50 per year.
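As a quick check on these figures, the following minimal sketch (illustrative Python, not part of the collection instrument; all inputs are taken from the paragraph above) reproduces the 250 staff-hours and the $13,822.50 annual cost.

```python
ANNUAL_HOURS = 2080                                 # work hours per year (footnote 8)
staff_salary = 115_000                              # average annual OIG staff salary
hourly_rate = round(staff_salary / ANNUAL_HOURS, 2) # $55.29

reviews_per_year = 50
hours_per_review = 4 + 1                            # 4 interview hours + 1 review hour

staff_hours = reviews_per_year * hours_per_review   # 250 hours/year
annual_cost = hourly_rate * staff_hours             # $13,822.50

print(f"{staff_hours} hours/year, ${annual_cost:,.2f}/year")
```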

A.15. Changes in Burden

This is a new collection of information.

A.16. Publication of Collection

Planning for this study began in March 2012. Collection and analysis of data not requiring OMB clearance, including interviews with select community members, have already been conducted. Once OMB clearance is granted, the survey and interview data collection will occur over two years, likely 2014 and 2015. Data will be analyzed and a draft evaluation report developed during 2016. Survey results will be assessed and tabulated; given the nature of the information to be collected, complex statistical analysis will be neither feasible nor desirable. The evaluation report will be finalized and delivered to NSF in 2016.

Table A.16. Estimated Project Schedule

Activity                                                                        | Time Schedule
Collect and analyze data from existing sources and interview community members  | Completed
Field user survey                                                               | After OMB approval; likely 2014
Conduct interviews with awardees                                                | Within 3 years of the OMB approval
Analyze data and develop draft evaluation report                                | Within 6 months after completion of the previous step; likely 2016
Finalize evaluation report                                                      | Within 6 months after completion of the previous step; likely 2016



A.17. Display of OMB Approval Number and Expiration Date

No exceptions are sought; the required OMB information will be displayed on the initial contact letter and reiterated orally prior to conducting the interviews.

A.18. Exceptions to Certification Statement (Item 19) on OMB Form 83-I

No exceptions are sought from the Paperwork Reduction Act certification or from Form 83-I.

1 Part II-Award and Administration Guide, Chapter IV, Part B.2.c, nsf.gov/pubs/policydocs/pappguide/nsf11001/aag_4.jsp#IVB

2 We calculated that a sample size of 93 completed surveys would provide a 95% confidence level with a 10% confidence interval (margin of error).
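For completeness, the sketch below (illustrative Python) shows one standard way to arrive at a figure of this size: the Cochran sample-size formula with a finite-population correction. The population size N used here is a hypothetical value supplied for illustration only; the exact population underlying our calculation is not restated in this footnote.

```python
z = 1.96   # z-score for a 95% confidence level
p = 0.5    # most conservative assumed proportion
e = 0.10   # 10% margin of error (confidence-interval half-width)

# Cochran formula for an effectively infinite population: ~96 surveys
n0 = z**2 * p * (1 - p) / e**2

def finite_correction(n0: float, N: int) -> float:
    """Finite-population correction: n = n0 / (1 + (n0 - 1) / N)."""
    return n0 / (1 + (n0 - 1) / N)

print(round(n0))                            # 96
print(round(finite_correction(n0, 3000)))   # 93 for a hypothetical N = 3,000
```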

3 From a 2011-2012 survey administered by the College and University Professional Association for Human Resources: http://www.higheredjobs.com/salary/.

4 Ibid.

5 Ibid. Because RCR Administrators could be tenured professors (average salary $92,800), Associate VPs for Research ($161,618), or Compliance Officers, which we treated as equivalent to Chief Information Officers ($125,446), we averaged the salaries of these three positions to arrive at an average yearly salary.

6 For postdoctoral researcher data, we used the average of the pay range on Payscale: http://www.payscale.com/research/US/Job=Postdoctoral_Research_Associate/Salary

8 We calculate the hourly rate by dividing the annual salary by 2080 hours.

9 Presuming one paid graduate student and one unpaid undergraduate student.



