Supporting Statement


National Science Foundation Surveys to Measure Customer Satisfaction


OMB: 3145-0157


SUPPORTING STATEMENT FOR PAPERWORK REDUCTION SUBMISSION

NSF 2011 SURVEY OF PRINCIPAL INVESTIGATORS AND REVIEWERS

A. JUSTIFICATION

  1. CIRCUMSTANCES MAKING COLLECTION OF INFORMATION NECESSARY

On September 11, 1993, President Clinton issued Executive Order 12862, “Setting Customer Service Standards,” which set out his vision that Federal agencies put the public first. To accomplish this, President Clinton called for a “revolution within the Federal government to change the way it does business.” He expected this process to require continual reform of government practices and operations so that, “when dealing with the Federal agencies, all people receive service that matches or exceeds the best service available in the private sector.”


Section 1(b) of this E.O. requires agencies to “survey customers to determine the kind and quality of services they want and their level of satisfaction with existing services,” and Section 1(a) requires agencies to “survey front-line employees on barriers to, and ideas for, matching the best in business.” These Presidential requirements established an ongoing need for the National Science Foundation (NSF) to engage in an interactive process of collecting information and using it to improve program services and processes.


Consistent with E.O. 12862, the purpose of the NSF 2011 Survey of Principal Investigators and Reviewers is to obtain information on the experiences and views of the members of these two critical NSF stakeholder groups with regard to the way NSF defines, communicates, and applies the two merit review criteria to its portfolio of increasingly complex and interdisciplinary research proposals. The survey will collect information that will be used to identify the advantages and disadvantages of the criteria, the usefulness of information and feedback NSF provides regarding the criteria, the effects the criteria have on the research NSF funds, and the changes, if any, that should be made to the criteria and how they are applied.



  2. HOW, BY WHOM, AND PURPOSE FOR WHICH INFORMATION IS TO BE USED

The National Science Board established a Task Force on Merit Review that is charged with examining the two NSF Proposal Merit Review Criteria and their effectiveness in achieving the goals for NSF support of science and engineering research and education. In carrying out this charge, among other things, the Task Force will be assessing how the two NSF Merit Review Criteria – Intellectual Merit and Broader Impacts – affect PIs in the scientific community as well as researchers who serve as NSF proposal reviewers. Responses from the NSF 2011 Survey of Principal Investigators and Reviewers will be used by the Task Force to assess the experiences and views of individuals in these two important stakeholder groups regarding the current use and application of the two NSF Merit Review Criteria. The information obtained from this survey will play a critical role in the Task Force’s deliberations on what, if any, changes are needed in how the NSF Merit Review Criteria are defined. The results of this survey will be incorporated in the Task Force report, scheduled for release in 2011, which will include recommendations on defining and describing appropriate merit review criteria. These results will also be made available on the NSF website in conjunction with the release of the Task Force report.


  3. USE OF AUTOMATION

There are no legal or technical obstacles to the use of technology in these information collection activities. The NSF 2011 Survey of Principal Investigators and Reviewers will be administered via the Internet, which will allow for a more convenient and less costly survey administration than a paper survey.


  4. EFFORTS TO IDENTIFY DUPLICATION

The information to be collected by the NSF 2011 Survey of Principal Investigators and Reviewers does not duplicate any other information collection. After consulting with various senior NSF officials, members of the NSB Task Force on Merit Review have concluded that the information sought in this survey, which is critical to the Task Force’s efforts, is not already available.


  5. SMALL BUSINESS CONSIDERATIONS

Not applicable.


  6. CONSEQUENCES OF LESS FREQUENT COLLECTION

Not applicable.


  7. SPECIAL CIRCUMSTANCES FOR COLLECTION

Not applicable.


  8. FEDERAL REGISTER NOTICE

The agency’s notice, as required by 5 CFR 1320.8(d), was published in the Federal Register on January 21, 2008, at 73 FR 3756, and no substantive comments were received.


  9. OUTSIDE CONSULTATION

SRI International is the consultant to the NSF on its NSF 2011 Survey of Principal Investigators and Reviewers and is responsible for survey design, administration, and analysis. In the course of this work, SRI International is providing outside expertise on issues of survey design and methodology, including the frequency of collection, the clarity of instructions, and the data elements to be gathered.


  10. GIFTS OR REMUNERATION

Not applicable.


  11. CONFIDENTIALITY PROVIDED TO RESPONDENTS

The NSF 2011 Survey of Principal Investigators and Reviewers will be a confidential survey. The instructions to the survey will clearly assure participants that all responses will be held in confidence. “There will be no individual attribution to any survey response. SRI as the survey administrator will maintain the confidentiality of all respondents. Any survey data provided to anyone outside of SRI, including NSF or the NSB, will be purged of information that could be used to identify individual responses.”


  12. QUESTIONS OF A SENSITIVE NATURE

No questions of a sensitive nature will be asked.


  13. ESTIMATE OF BURDEN

Each respondent will submit only one survey response. It is anticipated that the average response time will be fifteen minutes. This estimate is based on the survey length, feedback from NSF staff who reviewed and tested the survey, and time required to complete similar surveys in the past. NSF estimates the number of responses to the survey to be 3,200 for an annual burden of 800 hours.
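The burden estimate reduces to a single multiplication; a minimal check of the arithmetic, using only the figures stated above:

```python
# Burden-estimate arithmetic from the section above (figures taken from the text).
RESPONSES = 3_200          # anticipated number of survey responses
MINUTES_PER_RESPONSE = 15  # anticipated average response time

burden_hours = RESPONSES * MINUTES_PER_RESPONSE / 60
print(burden_hours)  # 800.0 hours of annual burden
```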


  14. ANNUALIZED COST TO RESPONDENTS

In March 2010, The Chronicle of Higher Education published a table of average faculty salaries by field and rank at 4-year colleges and universities for the 2009-2010 academic year. The data were collected by the College and University Professional Association for Human Resources. The salaries for professors, associate professors, and assistant professors were averaged together for the following fields: biological and biomedical sciences, education, engineering, engineering technologies/technicians, library science, mathematics and statistics, physical sciences, and science technologies/technicians.


Annualized Cost to Respondents

Average faculty salary (as described above): $70,086
Hourly salary based on 1,560 annual hours (40 hours per week for 39 weeks): $44.93
Estimate of survey burden: 800 hours
Annualized cost to respondents: $35,944
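The respondent-cost figures follow from the average salary and burden estimate; a minimal sketch reproducing the arithmetic from the figures stated above:

```python
# Annualized respondent cost, reproducing the figures in the section above.
AVG_SALARY = 70_086   # average faculty salary (Chronicle/CUPA-HR, 2009-2010)
ANNUAL_HOURS = 1_560  # 40 hours per week for 39 weeks
BURDEN_HOURS = 800    # total annual survey burden

hourly_rate = round(AVG_SALARY / ANNUAL_HOURS, 2)  # $44.93
annual_cost = round(hourly_rate * BURDEN_HOURS)    # $35,944
print(hourly_rate, annual_cost)
```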



  15. CAPITAL/STARTUP COSTS

Not applicable.


  16. ANNUALIZED COST TO THE FEDERAL GOVERNMENT

The table below estimates the annualized cost to the government associated with the NSF 2011 Survey of Principal Investigators and Reviewers. Costs include contractor support and the participation of federal government employees. The federal employee hourly rate was calculated from OPM’s Table No. 2011-ES for salaries effective January 2011, using the average SES salary for agencies with a certified SES performance appraisal system. The total annualized cost is estimated at $107,200.



Annualized Cost to the Federal Government

Contractor support for survey design, administration, and data analysis: $100,000
Hourly salary of federal government employee (SES Level IV): $72
Hours of federal government employee review and oversight: 100
Cost of federal government employee review and oversight: $7,200
Annualized cost to the federal government: $107,200
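The government-cost total is the contractor cost plus the employee review cost; a minimal check using the figures stated above:

```python
# Annualized cost to the government, from the figures in the section above.
CONTRACTOR_COST = 100_000  # contractor support for design, administration, analysis
HOURLY_RATE = 72           # federal employee hourly salary
REVIEW_HOURS = 100         # employee review and oversight hours

employee_cost = HOURLY_RATE * REVIEW_HOURS  # $7,200
total = CONTRACTOR_COST + employee_cost     # $107,200
print(total)
```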




  17. CHANGES IN BURDEN

There are no changes in burden. This proposed collection fits within the limits of our generic clearance.


18. PUBLICATION OF COLLECTION

The survey results will be available on the NSF Web site (www.nsf.gov) in Summer 2011.


19. SEEKING APPROVAL TO NOT DISPLAY OMB EXPIRATION DATE

Not applicable.


20. EXCEPTION(S) TO THE CERTIFICATION STATEMENT (19) ON OMB 83-I

There are no exceptions.


B. STATISTICAL METHODS

B.1. Universe and Sampling Procedures

The NSF 2011 Survey of Principal Investigators and Reviewers will use a simple random sample to gather information about its target population. The sampling frame will be generated from NSF lists of PIs who had proposals awarded or declined during FY 2010 and individuals who served as panel or ad hoc reviewers during FY 2010. Because individuals could appear on both lists, the contractor generated an unduplicated list of 100,500 individuals who were PIs and/or reviewers. A simple random sample of 8,000 of these individuals will be selected. NSF expects a response rate of 40%. The sample size of 8,000 will enable analyses of various subpopulations of interest.
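The expected survey yield follows directly from the sample size and the anticipated response rate; a minimal check of the arithmetic, using the figures stated in this section:

```python
# Sampling-plan arithmetic from B.1 (sample size and response rate as stated).
SAMPLE_SIZE = 8_000   # simple random sample drawn from the unduplicated list
RESPONSE_RATE = 0.40  # expected response rate

expected_completes = round(SAMPLE_SIZE * RESPONSE_RATE)
print(expected_completes)  # 3200 completed surveys
```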

Email addresses for the survey population are available from the lists of PIs and Reviewers that were used to generate the sample frame. This information is submitted by PIs seeking funding from NSF and is therefore very likely to be correct and current. Reviewer contact information is also maintained by NSF and is likely to be current. However, a small and as yet unknown percentage of PI and reviewer email data may be either inaccurate or out of date. The most likely reasons for such exceptions will be PIs moving to new institutions or changing email addresses.


The survey’s administration will include efforts to minimize the primary sources and effects of non-response bias. For purposes of this survey, the primary sources of non-response bias include:

(1) surveys not being delivered to their intended parties;

(2) unwillingness to participate in the survey; and

(3) respondents abandoning the survey before completing all responses.

Each of these areas will be addressed below.


To minimize the potential for undelivered surveys, the survey team will use and, where possible, correct contact information on members of the target population. The information from the NSF database will enable distribution of the online survey instrument to the entire target population with very few exceptions. The survey team will also closely monitor the delivery status of all surveys (the online administration of the survey automates such monitoring). For surveys that are not delivered, the survey team will make all reasonable efforts to identify and correct the information required for survey delivery (such as checking for typographical errors, verifying proper email syntax, or consulting academic institution Web sites), and will use this information to re-send the survey to the appropriate parties.


To minimize unwillingness to participate in the survey, the survey team will utilize strategic communications, a respondent-friendly online interface and format, and follow-up reminders (as described in greater detail below in section B.3. Methods to Maximize Response). These efforts will also address the potential tendency of respondents to complete only part of the survey.


As stated above, the target population for the NSF 2011 Survey of Principal Investigators and Reviewers consists of 100,500 individuals. This count represents unique PIs and reviewers within the NSF proposal database who had submitted a proposal that was awarded or declined in FY 2010 and/or served as a reviewer during that same period. The NSF Offices and Directorates to which these individuals submitted proposals, and/or for which they served as reviewers, include the following:


NSF Offices and Directorates Included in This Survey as Research Proposal Recipients

  • Office of International Science and Engineering (OISE)

  • Office of Cyberinfrastructure (OCI)

  • Biological Sciences (BIO)

  • Computer and Information Science and Engineering (CISE)

  • Engineering (ENG)

  • Geosciences (GEO)

  • Mathematical and Physical Sciences (MPS)

  • Social, Behavioral and Economic Sciences (SBE)

  • Education and Human Resources (EHR)

  • Office of Polar Programs (OPP)


During 2009-2010, these NSF Offices and Directorates reviewed and made award decisions on over 100,000 research proposals. These proposals were submitted and reviewed by individuals in the survey’s target population.


With an expected survey response rate of 40 percent, the anticipated number of completed surveys is 3,200. The following table illustrates this expected rate of response.


Target Population Description and Expected Number of Responses

Target population: (PIs) Individuals who submitted proposals to NSF that were awarded or declined in FY 2010, and/or (Reviewers) individuals who served as panel or ad hoc reviewers of NSF proposals in FY 2010
Target population/sample count: 100,500 (population); 8,000 (sample)
Expected number of responses: 3,200



B.2. Survey Methodology

The NSF seeks to address the following research questions:

  • What are the advantages and disadvantages of the two criteria?



  • What are the options for the role of PIs’ institutions in meeting the criteria?



  • What impact(s) have the two criteria had on how scientists think about doing their research?



  • How consistently are the merit review criteria applied by different parties?



  • What are the options for assessing the successes and failures of activities relevant to each criterion as the research is completed and afterward?



The survey methodology was designed to address these goals through the following steps:

  • Identify the survey objectives. Confirm objectives for the survey, review technical and practical assumptions, and identify internal and external stakeholders to engage in the survey development process.

  • Design the survey. Create a survey that will address the five research questions posed by the NSB Task Force and NSF.

  • Administer the survey. Develop and implement a communications strategy that will maximize the survey response rate. Administer the survey between March 3 and March 24, 2011.

  • Analyze survey results. Conduct analyses to summarize the data and identify patterns, trends, and significant relationships.

  • Report survey results. Provide an executive briefing to NSF senior managers on the high-level survey findings and their implications.

  • Prepare and deliver a final report. Incorporate changes and suggestions from the NSF senior managers into the survey findings and deliver the final report to NSF.


This submission to OMB includes the survey instrument developed in accordance with the process just described. The intent of the survey instrument is to address the goals established by NSF listed above. To accomplish this, several iterations of the instrument were developed and vetted with NSF staff as well as other stakeholders and senior leaders within NSF.


The NSF 2011 Survey of Principal Investigators and Reviewers will be distributed via email to each individual in the target population. The email will include an explanation of the purpose of the survey, assure respondents their responses will not be attributable and their identities will remain confidential, and describe where respondents may obtain survey results once the analysis of results is complete.


B.3. Methods to Maximize Response

To maximize the response rate, the administration of the NSF 2011 Survey of Principal Investigators and Reviewers will include a letter from the leader of the NSB Task Force on Merit Review, sent to the survey sample a week prior to the survey distribution. This letter (which will be sent by email) will lend credibility to the survey effort, explain how the survey will benefit the community (by allowing members to provide feedback so that NSF can continue to improve the proposal experience), and encourage recipients to complete the survey.


The online format and layout of the survey instrument are other factors that will help maximize the survey response rate. Through pilot testing and iterative feedback from contractor and NSF staff, the survey instrument was tested and refined for clarity and usability. Feedback from reviewers indicated that the survey will likely require approximately 15 minutes to complete. This length of time is well within the parameters for avoiding “questionnaire fatigue” and maintaining respondent interest.


The subject of the survey itself will enhance its response rate. NSF is aware of its stakeholders’ interest in NSF’s merit review process. The presumption of NSF, based on anecdotal evidence, has been that survey participants are concerned about the Merit Criteria subject and will welcome an opportunity to share information on their views and experiences on this important issue.


There will be at least one reminder sent to survey recipients who have not yet responded; reminders will be emailed only to non-respondents. Each reminder will be sent a few days after the response spike that typically follows a survey distribution or a prior reminder has dropped off and leveled. These reminders will also help to maximize the survey response rate.


While there will be no tangible reward or incentive to motivate participation in the survey, survey recipients will be informed of how they will be given access to survey results (the survey results will be posted on the NSF Web site after completion and submission of the final analysis of survey response data). This assurance is intended to engender respondent trust and confidence; by participating in the survey, respondents will have a stake in the survey’s outcome and will have access to the final results. Providing this assurance will contribute to maximizing the survey response rate.


B.4. Testing of Procedures

The development of the NSF 2011 Survey of Principal Investigators and Reviewers has involved and will continue to include several processes to test the survey instrument and its administration.


To test the survey questionnaire, iterative rounds of feedback were obtained from NSF staff who reflected a diversity of perspectives and perceptions. Through this feedback, successive refinements were made to the questionnaire, enhancing its clarity, ease of completion, and technical focus. Some of the items in this survey questionnaire are very similar to items included in an ongoing web-based survey of NSF staff that did not require OMB clearance. In addition to obtaining iterative rounds of feedback from NSF staff on that survey, it was also pretested in its online form with 6 individuals before implementation. We plan to conduct in-person pretests of the online version of the survey instrument during the next two weeks with 9 individuals from the target population and make any needed refinements.


The survey administration will involve preliminary testing of target population email addresses to detect incorrect email syntax or other errors that would prevent an otherwise valid email address from being used. This testing will be done through an automated function within the online survey application. After the initial survey distribution, it is expected that some email addresses will “bounce” and be returned because they are incorrect or no longer exist. Reasonable efforts, such as searching online references within an intended respondent’s academic institution, will be made to correct any email addresses that are found to be incorrect or changed.


B.5. Contacts for Statistical Aspects of Data Collection

Contacts for statistical aspects of data collection are Jongwon Park ([email protected]), Thomas Slomba ([email protected]), and Joanne Tornow ([email protected]).


Attachments

NSF 2011 Survey of Principal Investigators and Reviewers: Survey Instrument (hard copy and online address)


