Part B:


Outcomes Data Collection of the National Prevention Information Network (OMB Control No. 0920-0768)









Contact Information:


F. E. Harrison, MBA, PMP

National Prevention Information Network Services (NPIN) Project Officer

National Center for HIV, Viral Hepatitis, STD, and TB Prevention/CDC

1600 Clifton Rd NE, MS E-07

Atlanta, GA 30333

404-639-6095

404-639-8910 (fax)

[email protected]



January 7, 2011



B. Collections of Information Employing Statistical Methods


B.1. Respondent Universe and Sampling Methods


To assess NPIN users' satisfaction with the Web site and with NPIN products and services, two different surveys have been conducted. One survey has been administered to users of the NPIN Web site (i.e., the NPIN Web site User Survey) and the other has been administered to users of NPIN products and services (i.e., the NPIN Products and Services User Survey). These groups of users, however, are not mutually exclusive.


The analysis of responses to both surveys will consist primarily of descriptive statistics (i.e., frequency and percentage distributions, graphics, and cross tabulations) describing participants' satisfaction with the quality of the NPIN Web site and of NPIN products and services. Cross tabulations will be used to determine whether the level of participant satisfaction differs by respondent background characteristics (e.g., organization affiliation and target population served).
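
To illustrate the kind of descriptive analysis planned, the sketch below (in Python, using the pandas library) shows how frequency and percentage distributions and a cross tabulation of satisfaction by organization type might be produced. The file name and column names are illustrative assumptions only; they are not drawn from the approved survey instruments or the Key Survey export format.

    # Illustrative sketch only. The file name and column names below are
    # assumptions for demonstration; they are not part of the actual NPIN
    # survey instruments or the Key Survey system export.
    import pandas as pd

    # Hypothetical export of survey responses
    responses = pd.read_csv("npin_survey_responses.csv")

    # Frequency and percentage distribution of overall satisfaction ratings
    counts = responses["overall_satisfaction"].value_counts()
    percents = responses["overall_satisfaction"].value_counts(normalize=True) * 100
    print(pd.DataFrame({"count": counts, "percent": percents.round(1)}))

    # Cross tabulation of satisfaction by organization type, expressed as
    # row percentages so that groups of different sizes can be compared
    satisfaction_by_org = pd.crosstab(
        responses["organization_type"],
        responses["overall_satisfaction"],
        normalize="index",
    ) * 100
    print(satisfaction_by_org.round(1))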


NPIN Web site User Survey

The new NPIN Web site User Survey will be made available on the NPIN Web site, through the NPIN mailing lists, and through NPIN's social networks (e.g., Twitter). The data collected will be analyzed every six months. If the number of respondents reaches 500 at any point within a year, which is unlikely based on our experience over the past two years, the survey will be deactivated and then reactivated the following year.


NPIN Products and Services User Survey

The sampling frame for the NPIN Products and Services User Survey will include all organizations that have accessed NPIN services. A blast email will be sent to potential respondents through the mailing list or otherwise disseminated through NPIN's social networks (e.g., Twitter). As soon as 500 voluntary survey responses have been captured, the survey data for that period will be analyzed. In the rare cases where respondents do not have Web access, the survey will be administered by phone.



B.2. Procedures for the Collection of Information


The NPIN Team referenced the American Customer Satisfaction Index (ACSI) in developing the surveys. The surveys cover the following topics with regard to the NPIN Web site and NPIN products and services:

  • Top tasks – what users came to the Web site to do that day, and whether they were successful in doing it

  • Perceived quality – overall experience with the Web site/products and services, usefulness to the user, reliability of the Web site/products and services

  • Customer expectations

  • Customer satisfaction – overall satisfaction with the Web site/products and services, comparison to other Web sites/products and services

  • User trust – confidence in Web site/products and services

  • Use of Web site or products and services – features used, frequency of use

  • Other sources of information used

  • Suggestions from User – desirable features and content, suggestions for improvement

  • Background information of User – organization type, job/position, target population served


The NPIN Web site User Survey has been functional since 2008. Visitors to the NPIN Web site will continue to voluntarily complete the online survey. A blast email to alert users to the survey will be sent annually.


The NPIN Products and Services User Survey will be conducted biannually over a period of 1 to 3 months, during which potential respondents will be invited to participate via a blast email containing a link to the survey housed in the Key Survey system.


B.3. Methods to Maximize Response Rate and Deal with Nonresponse


Every effort will be made to ensure a high response rate from survey participants. The surveys are free of undue burden, unambiguous, and easy to complete. In addition, the design and layout of each survey adhere to general guiding principles of survey design (Shannon et al., 2002). Each survey is brief and concise, maximizes the use of closed-ended questions, uses clear response options and instructions, and asks for minimal background information about the respondent, with no identifying information. Moreover, the proposed data collection includes several proven methods for improving response rates to Web-based surveys, such as an email cover letter and, for the Products and Services User Survey, email follow-up reminders and the option to complete the survey by multiple modes (e.g., online or by phone) (Solomon, 2001).


NPIN has received some feedback that the Web survey link is difficult to find on the site, so we will make it more prominent.


B.4. Tests of Procedures or Methods to be Undertaken


No new methods or procedures are envisioned for respondents.



B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Contractor staff responsible for designing the study and for collecting and analyzing data:



Melissa Beaupierre

NPIN Project Director

Danya International, Inc.

9 Corporate Boulevard

Suite 100

Atlanta, GA 30329

Ph: 404-679-7908

Email: [email protected]


Jennifer Scherer, Ph.D.

Vice President, Program Management and Evaluation

Danya International, Inc.

8737 Colesville Road

Suite 1100

Silver Spring, MD 20910

Ph: 240-645-1131

Email: [email protected]

Individual responsible for receiving and approving contract deliverables:


F.E. Harrison

NPIN Project Officer

CDC/NCHHSTP/OD

8 Corporate Boulevard

Atlanta, GA 30329

Ph: 404-639-6095

Email: [email protected]


Shannon, D.M., Johnson, T.E., Searcy, S., & Lott, A. (2002). Using Electronic Surveys: Advice from Survey Professionals. Practical Assessment, Research & Evaluation, 8(1). Accessed June 24, 2010, at http://PAREonline.net/getvn.asp?v=8&n=1.


Solomon, D.J. (2001). Conducting Web-based Surveys. Practical Assessment, Research & Evaluation, 7(19). Accessed June 24, 2010, at http://PAREonline.net/getvn.asp?v=7&n=19.


The Regents of the University of Michigan. (2005). American Customer Satisfaction Index (ACSI) Methodology Report.


