Standardized Methods - QUIS

0693-0043-ITL-CSD-PasswordGeneration-StandarizedMethods.doc

NIST Generic Clearance for Usability Data Collections

OMB: 0693-0043

Questionnaire for User Interaction Satisfaction

What is the QUIS?

The Questionnaire for User Interaction Satisfaction (QUIS) is a measurement tool designed to assess a computer user's subjective satisfaction with the human-computer interface. It was developed at the Human-Computer Interaction Laboratory (HCIL), University of Maryland at College Park. The QUIS contains a demographic questionnaire, a measure of overall system satisfaction, and a measure of specific interface factors such as screen visibility, terminology and system information, learning factors, and system capabilities.

Who uses the QUIS?

The QUIS is used at both academic and industrial sites to evaluate systems and software. It has been shown to be both reliable and valid (Chin, Diehl, and Norman, 1988), and it is one of the few available quantitative measures of user satisfaction that does not require expensive performance testing. The QUIS can also be administered before and after changes are made to a system in order to quantify the magnitude of improvement.
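
For illustration, a before-and-after comparison of this kind can be summarized with simple descriptive statistics. The Python sketch below uses entirely hypothetical 0-9 overall-reaction ratings; the data and variable names are illustrative only and are not part of the QUIS:

    # Hypothetical 0-9 overall-reaction ratings collected before and after a redesign.
    before = [4, 5, 3, 6, 4, 5, 4, 3]
    after = [7, 6, 8, 7, 6, 7, 8, 6]

    mean_before = sum(before) / len(before)
    mean_after = sum(after) / len(after)

    # The difference in mean satisfaction gives a rough measure of the improvement.
    print(f"before: {mean_before:.2f}, after: {mean_after:.2f}, "
          f"change: {mean_after - mean_before:+.2f}")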

About the QUIS (http://www.lap.umd.edu/QUIS/about.html)

The Questionnaire for User Interaction Satisfaction (QUIS) is a tool developed by a multi-disciplinary team of researchers in the Human-Computer Interaction Lab (HCIL) at the University of Maryland at College Park. The QUIS was designed to assess users' subjective satisfaction with specific aspects of the human-computer interface. The QUIS team successfully addressed the reliability and validity problems found in other satisfaction measures, creating a measure that is highly reliable across many types of interfaces.

The QUIS 7.0 contains a demographic questionnaire, a measure of overall system satisfaction along six scales, and hierarchically organized measures of eleven specific interface factors (screen factors, terminology and system feedback, learning factors, system capabilities, technical manuals, on-line tutorials, multimedia, voice recognition, virtual environments, internet access, and software installation). Each area measures the users' overall satisfaction with that facet of the interface, as well as the factors that make up that facet, on a 9-point scale. The questionnaire is designed to be configured according to the needs of each interface analysis by including only the sections that are of interest to the user.
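
As a rough illustration of that configurable, hierarchical structure, the Python sketch below represents a QUIS-style instrument as named sections of 9-point items. The section and item names are taken from the sample questions later in this document; the code itself is an assumed representation for illustration, not part of the QUIS distribution:

    # Each section groups items rated on a 0-9 scale between two anchor phrases.
    # Only the sections relevant to a particular interface analysis need to be used.
    quis_sections = {
        "SCREEN": [
            ("Characters on the computer screen", "hard to read", "easy to read"),
            ("Sequence of screens", "confusing", "very clear"),
        ],
        "LEARNING": [
            ("Learning to operate the system", "difficult", "easy"),
            ("Exploring new features by trial and error", "difficult", "easy"),
        ],
    }

    # Configure a questionnaire by selecting only the sections of interest.
    selected = {name: quis_sections[name] for name in ("SCREEN",)}

A full configuration would list all eleven factor sections; the dictionary-of-tuples layout is only one possible way to hold the hierarchy.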


WWW Sites

http://www.lap.umd.edu/QUIS/index.html (overview, example questions)

http://www.lap.umd.edu/QUIS/references.html (semi-promotional article)

QUIS-related references:

Some of these papers are available on-line.

Chin, J. P., Diehl, V. A., and Norman, K. L. (1988). Development of an instrument measuring user satisfaction of the human-computer interface. Proceedings of SIGCHI '88 (pp. 213-218). New York: ACM/SIGCHI.


Chin, J. P., Norman, K. L., and Shneiderman, B. (1987). Subjective user evaluation of CF PASCAL programming tools. Technical Report (CAR-TR-304). College Park, MD: Human-Computer Interaction Laboratory, Center for Automation Research, University of Maryland.


Harper, B. D., and Norman, K. L. (1993). Improving User Satisfaction: The Questionnaire for User Interaction Satisfaction Version 5.5. Proceedings of the 1st Annual Mid-Atlantic Human Factors Conference (pp. 224-228). Virginia Beach, VA.


Sample Questions


User Evaluation of an Interactive Computer System

(For each of the following items, fill in a rating from 0 to 9, or leave the item blank if it is not applicable.)


OVERALL REACTIONS TO THE SOFTWARE

terrible  0 1 2 3 4 5 6 7 8 9  wonderful

difficult  0 1 2 3 4 5 6 7 8 9  easy

frustrating  0 1 2 3 4 5 6 7 8 9  satisfying

inadequate power  0 1 2 3 4 5 6 7 8 9  adequate power

dull  0 1 2 3 4 5 6 7 8 9  stimulating

rigid  0 1 2 3 4 5 6 7 8 9  flexible


SCREEN

Characters on the computer screen
hard to read  0 1 2 3 4 5 6 7 8 9  easy to read

Highlighting on the screen simplifies task
not at all  0 1 2 3 4 5 6 7 8 9  very much

Organization of information on screen
confusing  0 1 2 3 4 5 6 7 8 9  very clear

Sequence of screens
confusing  0 1 2 3 4 5 6 7 8 9  very clear


TERMINOLOGY AND SYSTEM INFORMATION

Use of terms throughout system
inconsistent  0 1 2 3 4 5 6 7 8 9  consistent

Computer terminology is related to the task you are doing
never  0 1 2 3 4 5 6 7 8 9  always

Position of messages on screen
inconsistent  0 1 2 3 4 5 6 7 8 9  consistent

Messages on screen which prompt user for input
confusing  0 1 2 3 4 5 6 7 8 9  clear

Computer keeps you informed about what it is doing
never  0 1 2 3 4 5 6 7 8 9  always

Error messages
unhelpful  0 1 2 3 4 5 6 7 8 9  helpful


LEARNING

Learning to operate the system
difficult  0 1 2 3 4 5 6 7 8 9  easy

Exploring new features by trial and error
difficult  0 1 2 3 4 5 6 7 8 9  easy

Remembering names and use of commands
difficult  0 1 2 3 4 5 6 7 8 9  easy

Tasks can be performed in a straight-forward manner
never  0 1 2 3 4 5 6 7 8 9  always

Help messages on the screen
unhelpful  0 1 2 3 4 5 6 7 8 9  helpful

Supplemental reference materials
confusing  0 1 2 3 4 5 6 7 8 9  clear


SYSTEM CAPABILITIES

System speed
too slow  0 1 2 3 4 5 6 7 8 9  fast enough

System reliability
unreliable  0 1 2 3 4 5 6 7 8 9  reliable

System tends to be
noisy  0 1 2 3 4 5 6 7 8 9  quiet

Correcting your mistakes
difficult  0 1 2 3 4 5 6 7 8 9  easy

Experienced and inexperienced users' needs are taken into consideration
never  0 1 2 3 4 5 6 7 8 9  always


USABILITY AND UI

Use of colors and sounds
poor  0 1 2 3 4 5 6 7 8 9  good

System feedback
poor  0 1 2 3 4 5 6 7 8 9  good

System response to errors
awkward  0 1 2 3 4 5 6 7 8 9  gracious

System messages and reports
poor  0 1 2 3 4 5 6 7 8 9  good

System clutter and UI “noise”
poor  0 1 2 3 4 5 6 7 8 9  good
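
One common way to summarize responses to items like those above is to average the ratings within each section, skipping items left blank as not applicable. The Python sketch below assumes, purely for illustration, that responses are stored per section with None marking a blank item; the data and the helper name are hypothetical:

    # Hypothetical responses; None marks an item left blank (not applicable).
    responses = {
        "OVERALL REACTIONS": [7, 6, None, 8, 5, 6],
        "SCREEN": [8, None, 7, 7],
    }

    def section_means(responses):
        # Average only the answered items in each section.
        means = {}
        for section, ratings in responses.items():
            answered = [r for r in ratings if r is not None]
            means[section] = sum(answered) / len(answered) if answered else None
        return means

    print(section_means(responses))
    # -> OVERALL REACTIONS about 6.4, SCREEN about 7.33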



File Type: application/msword
File Title: Questionnaire for User Interaction Satisfaction
Author: Computer Science
Last Modified By: Yonder, Darla
File Modified: 2013-12-03
File Created: 2013-12-03
