
OMB Control No. 0693-0031

NIST Generic Request for Customer Service-Related Data Collections


TIP 2008 Customer Satisfaction Survey


FOUR STANDARD SURVEY QUESTIONS


1. Explain who will be surveyed and why the group is appropriate to survey.


The Technology Innovation Program Customer Satisfaction Survey instrument is designed to reach two distinct groups; a skip pattern consolidates the questions for both groups into a single survey instrument.


Respondents will be mailed a notification letter asking them to participate in the survey; a copy of the notification letter is provided. Respondents will then be sent an email directing them to the survey website, along with a password for entering the survey. The survey is designed to reach the organizations on the TIP mailing list. The reasons for the survey are as follows:


  1. To assess the efficacy of TIP outreach

  2. To gauge customer satisfaction with how information is conveyed to potential applicants

  3. To gauge customer satisfaction with the proposal submission and review process


The first group consists of members from the TIP mailing list. Members of this mailing list proactively signed up to receive information and updates from TIP regarding current and future award competitions. The questions for this cohort are designed to answer three broad questions:

  • How effectively does TIP outreach represent all eligible organizations for future competitions?

  • How effective is TIP at communicating program news and reaching new potential proposers?

  • What portion of the mailing list is likely to propose research and what are the major reasons an organization does not propose research to the program?


The second group consists of organizations named as participants in proposals submitted to the 2008 TIP competition. This group will answer the same questions as the first group and will then answer additional questions based upon their experiences with the 2008 award competition. These additional questions are designed to address three further broad questions:

  • How effective is TIP at selecting proposals consistent with statutory requirements and goals of the program?

  • How do organizations view the proposal preparation and review process (including perceived fairness of the process and preparation burden)?

  • What additional evidence is there for the unique role served by TIP funding?



2. Explain how the survey was developed including consultation with interested parties, pre-testing, and responses to suggestions for improvement.


The survey instrument was developed with the contractor, Westat, who will be responsible for fielding the instrument. The former Advanced Technology Program (ATP) worked with Westat to field three previous customer satisfaction surveys (for the 2000, 2002, and 2004 award competitions). Primary personnel for developing the survey at TIP also worked with the same personnel at Westat in developing the previous ATP surveys. After each fielding, Westat provided a Methods Report which examined potential response-rate bias, effectiveness of questions, variability of responses, etc. Modifications were made based upon the lessons learned from previous fieldings. The previous surveys also included survey methodology research to determine survey mode efficacy in terms of response rate and survey mode influence for open-ended questions.


The respondents will be given 5 email reminders to complete the survey.  The survey will remain in the field for three months to accommodate scheduling conflicts with respondents' other duties at their organizations.  We have worked with the same contractor on previous survey fieldings; this protocol has worked very well and is consistent with the anticipated response rate submitted.



3. Explain how the survey will be conducted, how customers will be sampled if fewer than all customers will be surveyed, expected response rate, and actions your agency plans to take to improve the response rate.


The survey will be conducted via the web, and there will be no sampling. The cohort for the first group is the entire TIP mailing list. The cohort for the second group is all the organizations listed in proposals (as either a single company or a joint-venture member) for the 2008 award competition. Given the low respondent burden for the first cohort and experience from three previous ATP customer satisfaction surveys, we anticipate a response rate of 70 to 75 percent.


During previous fieldings of these types of surveys, we have developed a useful protocol for reminders and will apply these lessons to achieve a high response rate. Respondents will receive a pre-notification letter, an email invitation, and then a first and second email reminder. Respondents will be called after the second email reminder is sent to remind them that their responses are important to us. Copies of the email reminders are attached.



4. Describe how the results of the survey will be analyzed and used to generalize the results to the entire customer population.


Since the entire population will be surveyed, there will be no need to generate weights from sample respondents to generalize the results. The survey results will be used for both program evaluation and program improvements. The analysis will address the six broad questions listed in Question 1 of this response. These six questions were grouped by respondent population for Question 1 but can be thought of as addressing three topical areas:

  • Communication and outreach

  • Proposal solicitation and proposer burden

  • Measuring to mission


Future success of the program will require effectively engaging and communicating with organizations that may become future proposers. TIP will need to understand the extent to which new organizations are being reached and whether past communications are being received and are prompting action by potential proposers.


TIP is interested in engaging organizations who can contribute cutting-edge research to address our nation’s critical national needs. TIP needs to reach potential new partners while balancing two other issues related to burden. TIP does not want to engage organizations whose research agenda is unlikely to meet TIP’s requirements regarding high-risk research. TIP is also sensitive to the time and money burden associated with preparing a proposal. While TIP will always require that organizations submit proposals that allow the program to carry out its due diligence in awarding taxpayer money, the survey results can be used to mitigate administrative burden associated with submitting proposals.


TIP is committed to using data to verify the extent to which it is meeting its statutory mission. Several questions are designed to help analyze the efficacy of the TIP selection process in meeting the program’s mission.



