Memo/Justification for 0925-0589-02

Questionnaire Cognitive Interviewing and Pretesting (NCI)

OMB: 0925-0589

Date: March 12, 2009


TO: Office of Management and Budget (OMB)


Through: Marilyn Tuttleman, NIH Project Clearance Officer, OPERA

Vivian Horovitch-Kelley, NCI OMB Project Clearance Liaison Office


From: Cheryl Burg, Project Manager

Gordon Willis, Cognitive Psychologist

Division of Cancer Control and Population Sciences (DCCPS),

National Cancer Institute/NIH


Subject: Questionnaire Cognitive Interviewing and Pretesting (ARP/DCCPS/NCI)

OMB No. 0925-0589, Expiration Date 1/31/2010

Proposed Generic Sub-Study #2 “Pilot of a Survey Instrument Designed to Assess Users’ Perceptions of Website Changes”

(OMB No. 0925-0589-02)


Evaluation of the final design is a critical, though often overlooked, step in ensuring the usability of the websites that the National Cancer Institute (NCI) regularly designs and redesigns. This step is often skipped in part because existing survey instruments for measuring user satisfaction or usability of active websites require significant user effort to complete. The purpose of the proposed study is to design and test a new survey instrument for assessing user perceptions of specific design changes to an existing website. The instrument will allow users to make a very quick, direct assessment of the site and interaction design.


Background on Project


New website designs and website redesigns are being developed at the National Cancer Institute (NCI) on a regular basis. Typically, in a user-centered process, users are interviewed, the existing design (if available) is evaluated, and designers create a new design based on the available data. A usability evaluation may be conducted on a prototype of the design, and changes are often recommended after this evaluation. Once the website is launched, however, feedback on the final design is rarely collected.


Several challenges complicate the evaluation of post-launch websites. First, the funding needed to design a specialized evaluation instrument is often lacking. Second, the actual users are often unknown, particularly when the site is public-facing, as many NCI sites are. Third, the sample must be large enough to support analysis.


Several existing survey instruments have been used to measure user satisfaction or usability of active websites. For instance, the American Customer Satisfaction Index (ACSI) survey has been used by government agencies for this purpose. Standardized survey instruments such as the SUS (System Usability Scale), QUIS (Questionnaire for User Interaction Satisfaction), SUMI (Software Usability Measurement Inventory), and WAMMI (Website Analysis and Measurement Inventory) also exist for assessing designs.


Typically, however, these instruments are time-consuming for users to complete, resulting in low completion rates when participation is voluntary. To ensure a response rate high enough to allow conclusions to be drawn, the survey should be brief, for instance four to five questions, so that a user could complete it in two to three minutes. To our knowledge, such a survey is not in common use.


A pop-up survey would be a practical way to obtain feedback from a large audience of users.


Pop-Up Surveys


A pop-up survey appears in a new browser window when a user visits a website. The user is asked whether they would like to participate in a brief survey; if they agree, the survey questions appear in the window. If they decline, the window closes and they return to the website. Advantages of this methodology include direct targeting of actual users who are visiting the website, and collection of feedback while the site design is fresh in the participant’s mind.


Proposed Research


The proposed study, “Pilot of a Survey Instrument Designed to Assess Users’ Perceptions of Website Changes,” will design and test a new survey instrument to assess user perceptions of specific design changes to an existing website. The survey instrument will allow current users who are familiar with both the old and new site designs to provide a very quick, direct assessment of the design changes. All contact with users will be through the pop-up survey; at no time will there be any verbal probing of users. The survey questions will address whether the new website better meets users’ needs for finding and using information (refer to Attachment 1).


A key requirement of the survey instrument is that it produce results consistent with other forms of evaluation. We will gauge the degree of consistency by comparing its results with those obtained from other evaluation methodologies.


The website for the DCCPS Applied Research Program (ARP) (see Figure 1) is currently being redesigned and developed using user-centered design principles; this redesign provides an opportunity to test the new survey.


Figure 1: Applied Research Website, current design


For this specific use, the survey results will allow NCI to better understand user perceptions of the usefulness and usability of the new presentation of Applied Research Program information. Visitors to the new ARP website will be invited to participate until 100 persons have completed the survey; we believe this will provide sufficient data for analysis. Survey participants will be anonymous visitors to the ARP website.
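The invitation flow described above (a pop-up invitation that users may accept or decline, with recruitment stopping once 100 persons have completed the survey) can be sketched in plain JavaScript. This is an illustrative sketch only; the names (SurveyQuota, onVisit) are hypothetical and not part of any actual NCI implementation.

```javascript
// Hypothetical sketch of the pop-up invitation and 100-complete quota logic.
// Names here are illustrative, not taken from the memo or any NCI system.
class SurveyQuota {
  constructor(target) {
    this.target = target;   // stop inviting once this many surveys are complete
    this.completed = 0;
  }
  shouldInvite() {
    return this.completed < this.target;
  }
  recordComplete() {
    if (this.shouldInvite()) this.completed += 1;
  }
}

const quota = new SurveyQuota(100);

// Called once per site visit; `agreed` models the user's response to the invitation.
function onVisit(agreed) {
  if (!quota.shouldInvite()) return "no-invite"; // quota filled, no pop-up shown
  if (!agreed) return "declined";                // window closes, user returns to site
  quota.recordComplete();                        // brief survey completed in the pop-up
  return "completed";
}
```

In a real deployment the pop-up window itself would be opened by the browser and the quota count kept server-side; the sketch only shows the stopping rule.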


In the larger sense, the survey will provide NCI with a new tool to use in assessing redesigned websites.


Estimates of Hour Burden and Respondent Cost

Types of Respondents   Number of Respondents   Frequency of Response   Average Time Per Response (Minutes)   Annual Hour Burden (Hours)
Website users          100                     1                       3                                     5

Thank you for your consideration of this proposed sub-study #2.
