April 27, 2009
NOTE TO THE REVIEWER OF: OMB CLEARANCE #1220-0141 “Cognitive and Psychological Research”

FROM: Jean Fox, Research Psychologist, Office of Survey Methods Research

SUBJECT: Submission of Materials for the “Usability Test” Study
Please accept the enclosed materials for approval under the OMB clearance package #1220-0141 “Cognitive and Psychological Research.” In accordance with our agreement with OMB, I am submitting a brief description of the study.
The total estimated respondent burden for this study is 20 hours.
If there are any questions regarding this project, please direct them to Jean Fox at [email protected] or 202-691-7370.
This proposed study will support a series of research studies aimed at better understanding the process of usability testing. The overall effort is being coordinated by Rolf Molich of DialogDesign in Denmark (see a description of the Comparative Usability Evaluation (CUE) work at http://www.dialogdesign.dk/cue.html). In each effort, usability professionals conduct usability tests on a pre-determined website. The professionals report their results, and Molich, along with other leaders in the field, evaluates and compares the submissions. Over the years, these efforts have enhanced and expanded our understanding of usability testing.
In the initial 1998 study, the four participating usability teams found quite different results for the same website (Molich et al., 1998). Since then, Molich has explored the variables that may affect the types of problems found, such as the amount of interaction with developers and other stakeholders.
I have been invited to participate in the latest evaluation effort, which will occur at a workshop held in conjunction with the Usability Professionals’ Association (UPA) annual meeting in June. This workshop will focus on measures of usability, including task times, completion rates, and satisfaction measures. Each participant will collect data on exactly the same tasks on the same commercial car rental site. The organizers are asking workshop participants to conduct the usability test between April 27 and May 15 if possible. Then, during the workshop, the participants will determine how similar the results are and will consider possible explanations for any differences.
To prepare for the workshop, each participant must conduct a usability test and submit the report to the workshop organizers, along with a video of three sessions. At the conference, participants will review the contributions and try to identify patterns in each of the test results. BLS participation in the workshop will allow us to contribute to the advancement of usability testing and provide a unique opportunity to get input from other usability professionals on the methods used at BLS.
In addition to the evaluation for the UPA workshop, I would also like to address the rating scales used in usability tests. It is common to ask participants to rate their opinions using a series of agree-disagree statements. However, some research suggests that respondents may tend to agree with most statements, regardless of their content (Krosnick, 1999). Therefore, this study will look at two alternative methods for presenting usability questions.
One solution is to use response labels that directly reflect the construct you want to measure (e.g., “easy” to “difficult,” rather than “agree” to “disagree”). Another method, suggested by Krosnick and Berent (1993), is to use branched questions, asking for respondents’ overall opinions first (e.g., positive or negative), then for the magnitude of their opinions (e.g., very positive, moderately positive, somewhat positive). Krosnick and Berent note that branched questions can lead to a greater variety in responses (people are more likely to select extremes) and faster response times. This proposed study will explore these methods to obtain subjective usability ratings.
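To make the branching concrete, the following is a minimal sketch of how one branched item might be administered, assuming a simple console prompt; the wording, response labels, and function names are illustrative placeholders, not the actual instrument (which appears in Appendix D).

```python
# Hypothetical sketch of administering one branched rating item.
# The wording and response labels below are placeholders for illustration only.

def ask(prompt, options):
    """Prompt until the respondent enters one of the allowed options."""
    while True:
        answer = input(f"{prompt} {options}: ").strip().lower()
        if answer in options:
            return answer

def branched_item(negative_label, positive_label):
    # Step 1: ask for the overall direction of the opinion.
    direction = ask(f"Was the website {negative_label} or {positive_label}?",
                    [negative_label, positive_label, "neutral"])
    if direction == "neutral":
        return ("neutral", None)
    # Step 2: ask for the magnitude, only for non-neutral answers.
    magnitude = ask(f"How {direction} was the website?",
                    ["very", "moderately", "somewhat"])
    return (direction, magnitude)

# Example: the complexity item, branched as described above.
print(branched_item("complex", "simple"))
```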
The workshop organizers have prescribed the website and the tasks to use for the usability test. The website is a commercial car rental website. Although this website is not a BLS product, the goal of this research is to advance the field of usability. The tasks are typical activities a user would complete in the process of renting a car or finding out more information about a car rental (see Appendix A; these tasks may be revised slightly by the workshop coordinators).
At the start of the usability test, I will inform the participants that this study is part of a larger effort and that BLS is not actually working on a commercial website. I will explain that the goal of the study is to better understand the process of usability testing by comparing results from different labs.
Each participant will then be asked to complete an informed consent form (see Appendix B) and a video release form (see Appendix C). Participants do not need to complete the video release form to participate. The UPA workshop coordinators have asked for video from three participants so that they can evaluate the methods used in more detail. I will only submit video from participants who have signed the video release form.
Then the participants will complete the five tasks. After each task, participants will complete an ease of use rating (from easy to difficult on a 7-point scale). During the session, I will collect usability measures, such as task times and success rates, along with a count of errors made.
After completing all the tasks, participants will complete the System Usability Scale (SUS), a standard 10-item, post-test subjective rating questionnaire (Brooke, 1996, see Appendix D). Next, participants will complete several open-ended questions about the website.
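The memo does not describe how the SUS responses will be scored, but the standard procedure from Brooke (1996) rescales each item to a 0-4 contribution (for odd-numbered, positively worded items, the rating minus 1; for even-numbered, negatively worded items, 5 minus the rating) and multiplies the sum by 2.5 to yield a 0-100 score. A minimal sketch of that standard calculation, with made-up example ratings:

```python
def sus_score(ratings):
    """Standard SUS scoring (Brooke, 1996): ten ratings on a 1-5 scale,
    in questionnaire order, combined into a single 0-100 score."""
    if len(ratings) != 10:
        raise ValueError("SUS requires exactly ten item ratings")
    total = 0
    for item, rating in enumerate(ratings, start=1):
        if item % 2 == 1:        # odd items are positively worded
            total += rating - 1
        else:                    # even items are negatively worded
            total += 5 - rating
    return total * 2.5           # rescale the 0-40 sum to 0-100

# Made-up ratings for illustration only:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```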
Next, the participants will complete the additional exploratory ratings (also in Appendix D). There will be ten semantic differential items with construct-specific labels corresponding to the ten SUS items, and six branched questions (not all the SUS questions could be easily converted to branched questions). Ten participants (half) will receive the semantic differential questions first, and the other ten will receive the branched questions first (all participants will complete the SUS ratings first, to keep the data for the UPA workshop unbiased). Finally, participants will respond verbally to several open-ended questions about each format. Order of presentation will be considered in interpreting these results. Details of the protocol are given in Appendix E.
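As a small illustration of the counterbalancing described above, the following sketch randomly assigns ten of the twenty participants to receive the semantic differential items first and the other ten to receive the branched items first, with the SUS always administered before either set. This is a hypothetical sketch; the participant IDs and the assignment procedure itself are placeholders.

```python
import random

# Hypothetical sketch of the counterbalanced ordering described above.
participants = [f"P{n:02d}" for n in range(1, 21)]   # 20 participants (placeholder IDs)
random.shuffle(participants)
semantic_first = set(participants[:10])               # half receive the semantic differential items first

for pid in sorted(participants):
    if pid in semantic_first:
        order = ["SUS", "semantic differential items", "branched items"]
    else:
        order = ["SUS", "branched items", "semantic differential items"]
    print(pid, "->", " then ".join(order))
```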
There will be 20 participants, recruited from our database of participants. I plan to recruit participants who have rented a car within the last year.
Each test session is estimated to last about an hour per participant. Therefore, we estimate the total respondent burden to be 20 hours.
Each participant will receive $40 for their participation.
Participants will be informed of the voluntary nature of the study. Participants will also be informed that the study will be used to evaluate the measures collected in the usability test as part of a larger, cross-organization effort. Participants will be given a consent form (Appendix B) and a video recording release form (Appendix C) to read and sign. Written reports related to this study will not be released to the public in any way that would allow identification of individuals except as prescribed under the conditions of the Privacy Act Notice.
Appendix A: Tasks
Appendix B: Consent Form and Privacy Act Statement
Appendix C: Video Release Form
Appendix D: Post-Test Questions
Appendix E: Protocol
Brooke, J. (1996). SUS: A quick and dirty usability scale. In P. W. Jordan, B. Thomas, B. A. Weerdmeester, and I. L. McClelland (Eds.), Usability Evaluation in Industry (pp. 189-194). London: Taylor & Francis. Available at www.usabilitynet.org/trump/documents/Suschapt.doc
Krosnick, J.A. (1999). Survey research. Annual Review of Psychology, 50, 537-567.
Krosnick, J.A. and Berent, M.K. (1993). Comparisons of party identification and policy preferences: The impact of survey question format. American Journal of Political Science, 37(3), 941-964.
Molich, R., Bevan, N., Curson, I., Butler, S., Kindlund, E., Miller, D., and Kirakowski, J. (1998). Comparative evaluation of usability tests. Proceedings of the Usability Professionals’ Association Annual Meeting. Washington, DC.
Task 1: Rent an intermediate size car (2 or 4 door, not an SUV) at Logan Airport in Boston, Massachusetts.
Rental start Thursday 11 June 2009 at 09.00 am.
You plan to return the car four days later on Monday 15 June at 3.00 pm.
Use the “Budget.com” rate. Do not select the “Pay Now & Save up to 40%!” option.
Include Loss Damage Waiver (LDW) protection. Do not include any other protections.
If asked for a name, use John Smith, who is a US resident.
If asked for an email address, use [email protected].
Stop when you are ready to submit the reservation. Do not submit the reservation.
Record the price of the rental.
Task 2: How much does it cost to rent an economy size car in Myrtle Beach, South Carolina, over the weekend?
Rental start Friday 19 June 2009 at 3.00 pm.
You plan to return the car two days later on Sunday 21 June at 7.00 pm.
Use the “Budget.com” rate. Do not select the “Pay Now & Save up to 40%!” option.
Include Loss Damage Waiver (LDW) protection. Do not include any other protections.
Record the price of the rental.
Task 3: Let’s assume that you are staying at the Hilton Hotel in downtown Portland, Oregon, USA. Find the address of the nearest Budget rental office.
The Hilton Hotel address is 921 SW Sixth Avenue, Portland, Oregon, United States 97204.
Record the address of the rental location.
Task 4: What are the opening hours of the Budget office in Great Falls, Montana, on a Tuesday?
Record the hours.
Task 5: Let’s assume that you have rented an intermediate size car from Budget in Orlando International Airport, Florida. Your rental includes LDW (Loss Damage Waiver). An unknown person has scratched the car seriously in several places, probably with a knife. A mechanic has roughly estimated that the repair will cost around 2,000 USD. Are you liable for the repair costs? If so, approximately how much are you liable for?
Record the response.
The Bureau of Labor Statistics (BLS) is conducting research to evaluate the methods we use to improve the quality of BLS surveys. This study is intended to suggest ways to improve our usability methods. In this study, you will complete several tasks using an online commercial website. We will collect data about your interaction, such as task times and ease of use ratings, and use aggregated data in the analysis. The goal of the study is to evaluate how these measures help identify usability problems and improvements. Other organizations are participating in this work as well.
The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent. The Privacy Act notice on the back of this form describes the conditions under which information related to this study will be used by BLS employees and agents.
During this research you may be audio and/or videotaped, or you may be observed. If you do not wish to be recorded, you still may participate in this research.
We estimate it will take an average of one hour to participate in this research.
Your participation in this research project is voluntary, and you have the right to stop at any time. If you agree to participate, please sign below.
Persons are not required to respond to the collection of information unless it displays a currently valid OMB control number. The OMB control number for this collection is 1220-0141, which expires February 29, 2012.
------------------------------------------------------------------------------------------------------------
I have read and understand the statements above. I consent to participate in this study.
___________________________________ ___________________________
Participant's signature Date
___________________________________ ___________________________
Participant's printed name Researcher's signature
OMB Control Number: 1220-0141
Expiration Date: 2/29/2012
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the U.S. Department of Labor, Bureau of Labor Statistics (BLS), under authority of 29 U.S.C. 2. Your voluntary participation is important to the success of this study and will enable the BLS to better understand the behavioral and psychological processes of individuals, as they reflect on the accuracy of BLS information collections. The BLS, its employees, agents, and partner statistical agencies, will use the information you provide for statistical purposes only and will hold the information in confidence to the full extent permitted by law. In accordance with the Confidential Information Protection and Statistical Efficiency Act of 2002 (Title 5 of Public Law 107-347) and other applicable Federal laws, your responses will not be disclosed in identifiable form without your informed consent.
The Office of Survey Methods Research at the Bureau of Labor Statistics occasionally conducts training and research projects on data collection methods such as cognitive interviews, focus groups, and usability testing. By signing this form, you agree to have your image and voice recorded during the study. You relinquish any rights to the recording and understand that the recording may be copied and used by the Bureau of Labor Statistics for research-related activities and presentations without further permission. You will not be identified by name.
If you prefer not to release your recording, simply return the form without a signature. Refusing to sign this form will not affect your participation in the study.
Signed _________________________________________ Date ___________
At the end of the session, participants will rate their agreement with the following statements.
1. I think that I would like to use this website frequently.
2. I found the website unnecessarily complex.
3. I thought the website was easy to use.
4. I think that I would need the support of a technical person to be able to use this website.
5. I found the various functions in this website were well integrated.
6. I thought there was too much inconsistency in this website.
7. I would imagine that most people would learn to use this website very quickly.
8. I found the website very cumbersome to use.
9. I felt very confident using the website.
10. I needed to learn a lot of things before I could get going with this website.
In addition to the ratings, participants will be asked a few open-ended questions, including:
What features did you like about the website?
What features didn’t you like about the website?
The home page claims that you can “Rent a car in 60 seconds.” What does this mean to you? Does your experience support that claim?
Do you have any suggestions for improving the website?
Do you have any other comments or suggestions?
These additional ratings cover the same 10 qualities as the SUS items, but each is revised so that the scale represents the quality of interest rather than agreement.
When renting a car in the future, I would like to use the website
Frequently ○ ○ ○ ○ ○ Never
The website was
Complex ○ ○ ○ ○ ○ Simple
The website was
Easy ○ ○ ○ ○ ○ Difficult
In using the website, I would
Definitely Need Help ○ ○ ○ ○ ○ Definitely Not Need Help
The website was
Well integrated ○ ○ ○ ○ ○ Disjointed
The website was
Inconsistent ○ ○ ○ ○ ○ Consistent
Most people would
Learn the website quickly ○ ○ ○ ○ ○ Need a long time to learn the website
The website was
Cumbersome ○ ○ ○ ○ ○ Efficient
In using the website, I was
Very Confident ○ ○ ○ ○ ○ Not at all Confident
In using the website, I
Needed to learn a lot first ○ ○ ○ ○ ○ Got started right away
These additional ratings are the same as six of the SUS items, but revised so each item is a branched scale.
All participants will be asked the original question. Unless they select “Neutral,” they will then be asked one of the two corresponding branched follow-up questions, depending on the direction of their answer.

Original question: Was the website complex or simple?
Branched questions: How complex was the website? / How simple was the website?

Original question: Was the website easy or difficult to use?
Branched questions: How easy was the website to use? / How difficult was the website to use?

Original question: Were the functions of the website well-integrated or disjointed?
Branched questions: How well-integrated were the functions of the website? / How disjointed were the functions of the website?

Original question: Was the website consistent or inconsistent?
Branched questions: How consistent was the website? / How inconsistent was the website?

Original question: Will people learn to use this website quickly or slowly?
Branched questions: How quickly will people learn to use this website? / How slowly will people learn to use this website?

Original question: Was this website cumbersome or efficient to use?
Branched questions: How cumbersome was the website? / How efficient was the website?
After each group of questions, I will ask participants the following questions about the format of the questions they just answered.
How did you feel about the format of these questions?
Did you like one format better than the other? If so, which one? Why?
To administer this study, I will:
Welcome participants.
Explain the purpose of the study.
My name is Jean Fox, and I do usability work for BLS. That means that I try to design software and websites so that people can use them easily. We have a number of methods that we use to ensure that our products are usable, and one of those methods is usability testing. In a usability test, we have typical users try to complete tasks with the software or website of interest. By observing and talking with the participants, we can learn where there are usability problems.
Today’s test is part of a series of research studies aimed at better understanding the process of usability testing. Other labs will be conducting a similar test with the same website to see what usability problems they uncover. We will be working with a commercial website that is not operated by BLS. Neither BLS nor the other organizations or individuals participating in the evaluation are involved with the website.
The website we’ll be using belongs to a car rental company. I will ask you to complete five tasks related to renting cars. We will record information such as task times and whether you completed the task successfully. We are testing the website, not you, so if you have difficulty, it is a problem with the website.
After you complete each task, I’ll ask you how easy or difficult you thought it was. After completing all five tasks, I’ll ask for your feedback on your overall experience with a series of ratings and open-ended questions. The session should last about an hour.
Explain the Informed Consent Form (Appendix B), and ask participants to sign it.
This form states that this research is voluntary and that you can withdraw at any time. We will videotape the session, but will not use it outside the research team unless you sign the next form. You will receive $40 for your participation, which I’ll give you when we’re done.
Explain the Video Release Form (Appendix C), and ask participants to sign it. Participants will not have to sign the form to participate.
Since this work is part of a larger, cross-organizational study, this form will allow us to share the video with others for research purposes. We will only share the video of the screen, not your face. You don’t need to sign this to participate in the study.
Provide each of the five tasks (Appendix A) to the participants in random order. Ask participants to rate the ease of use for each task.
Ask participants to complete the SUS questionnaire online (Appendix D), with the questions in the order given.
Ask several open-ended questions regarding their experience.
What features did you like about the website?
What features didn’t you like about the website?
The home page claims that you can “Rent a car in 60 seconds.” What does this mean to you? Does your experience support that claim?
Do you have any suggestions for improving the website?
Do you have any other comments or suggestions?
Ask half of the participants to complete the construct-specific, semantic differential ratings, and the other half to complete the branched ratings.
Ask the participants to complete the other set of ratings.
Ask participants the following questions about the format of the questions they just answered.
How did you feel about the format of these questions?
Did you like one format better than the other? If so, which one? Why?
Ask if the participants have any other comments or questions.
Pay participants and have them sign the receipt.