Form CMS-10588 Usability Test Script

Generic Clearance for the Collection of Qualitative Feedback on Agency Service Delivery

QNP-PMBR-UsabilityTest-Artifacts

(CMS-10588) Usability Testing and Evaluation for Phase 1 of the QualityNet Portal (QNP) Redesign Project

OMB: 0938-1185

Edaptive Systems
400 Red Brook Blvd, Suite 120
Owings Mills, MD 21117
Phone: 410-327-3366
Fax: 410-327-0612
www.edaptivesys.com

Usability Test Script

Welcome and Purpose
Hi [test participant first name]. I’m [name] and I work with [insert relevant information].

Thank you so much for your time today.

Before we get started, I want to give you a little information about what you will be looking at
and give you time to ask any questions you might have.

Today we are asking you to serve as an evaluator of a web site and to complete a set of tasks.
The goal is to see if the site works as intended. The session should last about [insert
timeframe].

As you use the site, I’m going to ask you as much as possible to think out loud: try to say what
you’re looking at, what you’re trying to do and what you’re thinking. This will be really helpful to
us. Please don’t worry about hurting our feelings. We’re trying to improve the site, so we need
your honest reactions.

I’d like to make it clear that we’re testing the site and not you. There is no right or wrong answer.

If you have any questions while you’re working, please let me know and I’ll do my best to
answer. Because we’re interested in how someone does something without any assistance, I
may not be able to answer your question immediately. If you still have questions at the end, I’ll
do my best to answer them. Also, if you need to take a break at any time, just let me know.

Recording Permission
I’m here to guide you through the tasks that you’ll go through on the web site today. [If
appropriate include] I have a colleague helping me take notes and observe your interaction with
the site as well.

[Use if testing is in person.]
You may have noticed the microphone. With your permission, we’re going to record what
happens on the screen and our conversation. The recording will only be used to help us figure
out what needs to be improved with the site, and it won’t be seen by anyone except for the
people working on this project.

If you would, I’m going to ask you to sign a simple permission form. It says that you allow us to
record the session, and that we’ll only share it with the people on the project team.

[Use if testing is conducted via recorded webex.]
With your permission, we’re going to record what happens on the screen and our conversation
using the recording feature on the webex. The recording feature will only be used to help us
figure out what needs to be improved with the site, and it won’t be seen by anyone except for
the people working on this project.

I know that we emailed you a recording permission form to complete, and I wanted to confirm
that I’ve received that back from you.

Facilitator Tasks:
• Accept permission form
• Begin recording session

Introductory Questions
Do you have any questions before we begin?

Great! Before we move over to the web site, I’d like to ask you a few quick questions:
• [Insert any questions relevant/helpful to the usability test]

Thanks. We’re done with the questions, and we can move on.


Task 1
I’m going to take you through some specific tasks. I’m going to read them out loud and also
give you a written version.

I’m also going to ask you to go through these tasks without using search. We’ll learn a lot more
about how the site works that way.

[Read task aloud and have participant complete.]

Follow-up Questions and Closing
Thanks. That was very helpful. We really appreciate your time today.

Do you have any additional questions after completing the tasks you just went through?

[Insert any information about an incentive, if being provided.]

Thanks again and enjoy the rest of your day.

Facilitator Tasks:
• Stop recording and end webex session (if using)
• Escort participant out (if in person)



Consent Form: Remote Usability Test
Please read and sign this form.
During this usability test I agree to participate in an online session using my computer and telephone.
During the session I will be interviewed about the site, asked to find information or complete tasks using
the site, and may be asked to complete an online questionnaire about the experience.
I understand and consent to the use and release of the recording by [Agency/Organization]. I understand
that the information and recording are for research purposes only and that my name and image will not be
used for any other purpose. I relinquish any rights to the recording and understand the recording may be
copied and used by [Agency/Organization] without further permission.
I understand that participation is voluntary and I agree to immediately raise any concerns I might have.
If you have any questions after today, please contact [Insert Contact and email address].
Please sign below to indicate that you have read and understand the information on this form and that
any questions you might have about the session have been answered.

Date:_________
Please print your name: ____________________________________________________
Please sign your name: ____________________________________________________
Subject's Signature or eSignature 
Thank you!
We appreciate your participation.

Please return the signed document to [email or physical address].

Test: (Site name)

__/__/__ to __/__/__

Usability Test Note Taking Spreadsheet
This spreadsheet is for taking notes during usability tests. It contains the following worksheets:

1. Pre-test interview: Background questions about the participant to determine user role/type. These questions can mirror the questions asked to recruit the participant. Can also ask additional questions such as technology use/experience.

2. Scenarios: Scenarios and information for the participant to do during the test; includes a place for qualitative, quantitative, and other types of notes, like time on task.

3. Success Criteria: Definitions of success; a reference for note takers to consult during the test.

4. Post-test interview: Questions administered after the test.

5. System Usability Scale: System Usability Scale, a standardized way of scoring the usability of a website or application.


Pre-Test Questions
Columns: Q1 [TEXT], Q2 [TEXT], Q3 [TEXT], Q4 [TEXT]
Rows: P1 through P10 (one row per participant)

Success Criteria for Scoring Scenarios
Success

Completes the task with minimal effort (must include all of the following):
· Reaches destination within 2 attempts
· Does not receive hints from the facilitator
· Does not ask for help
· Does not encounter error messages
· Does not mention frustration

Partial Success

Completes the task with moderate effort (can include any of the following):
· Reaches destination within 3 attempts
· Receives 1 hint from the facilitator
· Encounters 1 or 2 error messages
· Has to back up or re-enter information
· Mentions minor frustration or expresses minor confusion

Failure

Can include any of the following:
· Does not complete the task or completes the task with considerable effort
· Reaches destination in 4 or more attempts
· Receives 2 or more hints from the facilitator
· Encounters more than 2 error messages or the same error message more than once
· Has to back up or re-enter information several times
· Mentions serious frustration or confusion
· Mentions they would have to call or speak with someone to complete the task
· Concludes the task is completed successfully when it is not

Skip

Task was skipped due to time constraints or because the task was not meant for a particular user type.

Attempts:

An attempt is defined as one pathway or particular effort to find information. Signs that the
participant is attempting the task again include:
· Using the back button
· Starting over from the home page
· Verbal mentions of starting over, such as “this isn’t what I’m looking for,” “I’m not finding what I need,” or “I’m not sure if this is the right way to go,” followed by a change in navigation or search strategy
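
Note takers who want to apply the rubric above consistently can also encode it. The Python sketch below is one possible reading of these criteria; the field names, the frustration levels, and the decision order (check Skip, then any Failure condition, then require every Success condition, otherwise Partial Success) are illustrative choices and not part of the form.

from dataclasses import dataclass

@dataclass
class TaskObservation:
    # Illustrative fields a note taker might record for one participant/task pair.
    completed: bool
    attempts: int
    hints: int
    asked_for_help: bool
    error_messages: int
    repeated_same_error: bool
    reentered_info_several_times: bool
    frustration: str  # "none", "minor", or "serious"
    would_call_someone: bool
    thought_done_but_was_not: bool
    skipped: bool = False

def score_task(obs: TaskObservation) -> str:
    """Map one observation to Success / Partial Success / Failure / Skip per the rubric above."""
    if obs.skipped:
        return "Skip"
    failure = (
        not obs.completed
        or obs.attempts >= 4
        or obs.hints >= 2
        or obs.error_messages > 2
        or obs.repeated_same_error
        or obs.reentered_info_several_times
        or obs.frustration == "serious"
        or obs.would_call_someone
        or obs.thought_done_but_was_not
    )
    if failure:
        return "Failure"
    success = (
        obs.attempts <= 2
        and obs.hints == 0
        and not obs.asked_for_help
        and obs.error_messages == 0
        and obs.frustration == "none"
    )
    return "Success" if success else "Partial Success"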

AW Final Benchmark Spreadsheet


[NAME OF TEST]
[DATE]

Columns: Task ID, Task Name, Task (scenario wording), P#, Time, Verbal comments, Pathway (what they do), Attempts, Score, Other notes
Rows: P1 through P10 for each task; the template shows Task ID 1 with a placeholder task name and scenario wording.

Post-Test Questions
Columns: Q1 [TEXT], Q2 [TEXT], Q3 [TEXT]
Rows: P1 through P10 (one row per participant)

System Usability Scale
Use the table below to help you calculate the SUS score for each participant. A line has been filled out as an example.
Reference: http://www.measuringusability.com/sus.php

The worksheet lists the ten SUS statements as columns 1 through 10, each with a response cell and a SUS# cell, followed by a SUS Score column. Participants P1 through P10 are the rows.

Statements (each rated on the key below):
1. I think that I would like to use this application frequently
2. I found this application to be unnecessarily complex
3. I thought this application was easy to use
4. I think that I would need the help of a support person to use this application
5. I found the various functions in this application were well integrated
6. I thought there was too much inconsistency in this application
7. I would imagine that most people would learn to use this application very quickly
8. I found this application very cumbersome to use
9. I felt very confident using this application
10. I needed to learn a lot of things before I could get going with this application

Key: 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree

Scoring: for odd-numbered statements, SUS# = response - 1; for even-numbered statements, SUS# = 5 - response. The SUS Score is the sum of the ten SUS# values multiplied by 2.5.

Example row (P1): responses of 4, 2, 3, 2, 4, 2, 4, 2, 3, 2 give SUS# values of 3, 3, 2, 3, 3, 3, 3, 3, 2, 3, which sum to 28 for a SUS Score of 70.0. Rows P2 through P10 are blank; their formula cells default to a SUS Score of 50.0 until responses are entered.
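
For teams that prefer to compute the score outside the spreadsheet, the short Python sketch below reproduces the same arithmetic shown above (odd items contribute response - 1, even items contribute 5 - response, and the total is multiplied by 2.5). The function name and input format are illustrative, not part of the worksheet.

def sus_score(responses):
    """Compute a 0-100 SUS score from ten 1-5 ratings, ordered as statements 1-10."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("each response must be on the 1-5 scale")
        # Odd-numbered (positively worded) items: response - 1.
        # Even-numbered (negatively worded) items: 5 - response.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# The example participant (P1) from the worksheet:
print(sus_score([4, 2, 3, 2, 4, 2, 4, 2, 3, 2]))  # 70.0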

