Usability testing for the "Pathways to Work Evidence Clearinghouse" website

Fast Track Generic Clearance for Collection of Qualitative Feedback on Agency Service Delivery

OMB: 0970-0401


Pathways to Work Clearinghouse – Website Usability Testing Scope and Protocol

Document Overview

This document describes our plan for testing the usability of the Pathways to Work Evidence Clearinghouse website functionality and features.

Objectives

  • Review and test the evidence review website under controlled test conditions with members of the site’s audience. The testing data will be used to assess whether usability goals for an effective, efficient, and well-received user interface have been achieved.

Potential usability issues may include:

    • Messaging issues – Users should be able to easily determine the purpose and value of the site.

    • Navigation issues – Users should be able to easily locate features and information.

    • Scannability issues – Information should be presented in a way that is hierarchical and easy to scan and consume.

    • Accuracy issues – Users’ actions should produce the desired results.

    • Language or terminology issues – The site should use language and terms familiar to users, and help for understanding site terms should be readily available.

  • Better understand any usability issues and their underlying cause so that they can be resolved.

Participants

The usability test participants will include members of the website’s intended audience: TANF administrators, federal staff, and researchers.

Roles

The roles involved in the usability test are as follows:

Facilitator

  • Explains purpose of usability testing to test participants.

  • Keeps participants focused on tasks.

  • Prompts participants for their input and encourages them to think out loud.

  • Administers pre-test and post-test questionnaires.

  • Responds to participants’ requests for assistance before, during, and after the test.

Test Observers

  • Silent observers

  • Serve as note takers


Testers

  • Attempt to complete a set of representative tasks in as efficient and timely a manner as possible.

  • Provide feedback regarding the usability of site features and functions. The testers will be directed to provide honest opinions regarding the usability of the application.

  • Complete pre- and post-test subjective questionnaires.

Procedure

Each usability test session will be scheduled for approximately 45 minutes and will include one tester per session. The testing will be conducted remotely using the Webex video conferencing platform. Test sessions will be recorded and the recordings will be shared for analysis with project staff members only.

Testers will be emailed an invitation for the test (see Attachment). The invitation will briefly explain the purpose and the reasons for the usability test. Once they agree to participate, they will receive a separate calendar reservation that includes the Webex meeting information.

During the test, testers will share their screen with the facilitator and observers so that the testers’ actions can be observed. The facilitator will brief testers about the purpose of the test and explain that the testers will be evaluating the website, rather than the facilitator evaluating the testers. The facilitator will ask testers for permission to record their tests. Testers will be asked to complete a brief pre-test questionnaire (see below to view the questionnaire).

The facilitator will share copies of the test scenarios with testers and will instruct them to begin tasks when they are ready, to read each task scenario aloud before starting it, and to move on to the next task when they feel they have completed the previous one. Usability task measurement will begin once testers start their first task. The facilitator will encourage the testers to ‘think aloud’ as they try to complete tasks. The facilitator and test observers will note testers’ behavior and comments, and the facilitator may prompt testers to explain their actions. The facilitator may help testers get back on track with tasks if their actions are no longer helpful for discovering usability issues. After all tasks have been attempted, the tester will complete a post-test questionnaire (see below to view the questionnaire).

Usability Test Script

First, let me introduce myself: my name is [Facilitator name], and I am a [Facilitator position on project]. I’m going to walk you through this usability test of the Pathways to Work Clearinghouse. Your participation is completely voluntary, and we are grateful for your help. I need to tell you that an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB number for this information collection is 0970-0401 and the expiration date is 05/31/2021.

You probably already have some idea of what we’re trying to do here, but let me give you a brief overview. The purpose of this usability test is to determine if people find the Pathways to Work Clearinghouse easy to use and navigate. To that end, I’m going to ask you to perform some tasks on the Pathways to Work Clearinghouse website. As you work through these tasks, try to think out loud and tell us what you’re looking at, what you’re trying to do, and what you’re thinking. Work at your own pace; there is no rush. Go on to the next task when you’re ready.

The test shouldn’t take more than an hour. If you need to take a break at any point, just let me know. If you have any questions as we go along, just ask them. I may not be able to answer them right away, since we’re interested in seeing what people do without help, but if you still have any questions when we’re done I’ll try to answer them then. If you think that you are getting stuck on a particular task, feel free to go back and re-read the task question.

One thing I want to make clear is that we’re testing the website, not you. You can’t do anything wrong here. Also, please don’t worry about hurting our feelings. We’re doing this to improve the site, so we need to hear your honest feedback.

With your permission, we’re going to record what happens on the screen and our conversation. The recording will only be used to help us figure out how to improve the site, and it won’t be seen by anyone except the people working on this project.

  • HIT THE RECORD BUTTON!!!

Now, I am going to ask you a few quick questions before we start the usability test:

Pre-Test Questions

  1. Which describes you best: I am a researcher. I am a policy maker. I am a practitioner.

  2. Do you use websites to find research?

    • If yes, how often? What tasks are you trying to accomplish on those sites? Based on your experience, what do you like and dislike about the sites you’ve used to find research?

  3. Have you used any other evidence review websites, for example the Department of Education’s What Works Clearinghouse, the Department of Labor’s Clearinghouse for Labor Evaluation and Research, or ACF’s Home Visiting Evidence Review or Employment Strategies Evidence Review?

    • If yes, what other sites have you used? What do you like/dislike about these sites?

  4. Do you have any favorite websites (research or non-research related) that you love to use? What do you like about them?

Usability Tasks

Facilitator says: Thank you, the context you provided was very helpful.

We have tasks that I’d like you to attempt today. Please read the tasks out loud before you start each one. Move on to the next task when you feel you have completed the previous task. Start whenever you feel ready and once again please say aloud what you’re looking at, what you’re trying to do, and what you’re thinking.


Note: The specific tasks tested with each tester will be tailored based on the phase of site development and the tester’s relationship to the project, but will include some combination of the types of sample tasks below.

  1. Test Objective: Test the site messaging and theme.

  • Can users determine the purpose of the site from the home page?

  • Does the home page encourage users to continue exploring the site and using site features?

  • Does it suggest actions for the users to take when visiting the site?

Sample tasks:

  1. Looking at the website’s home page, what kind of information do you think you would find on this site? What is the first thing that caught your attention? What action do you think you would take next?

  2. Looking at the website navigation items and without visiting those sections, can you tell me what information or features you would expect to find in each of those sections?



  2. Test Objective: Test for usability issues related to the About section.

  • Can the user find information about the purpose and scope of the evidence review?

  • Will users expect to find this information in the About section of the site, or would they expect to find it somewhere else?

Sample Tasks:

  1. What area or topic of research is reviewed on the website?

  2. What types of studies were reviewed?

  3. What are the requirements a study must meet to be included in the evidence review?

  4. How do studies with high ratings differ from studies with low ratings?



  3. Test Objective: Test for usability issues related to the Publications section.

  • Does the layout of the Publications section give users the information they need to determine if a publication contains the desired information?

Sample Tasks:

  1. What types of publications are available through the site? What topics are covered?

  2. Can you find a publication about [publication subject]?

  4. Test Objective: Test usability of the study and project searches.

  • Do the layout and number of filter options make the filter option lists difficult to use?

  • Do keyword and filter searches return the expected results? Can users successfully use the filters, keyword search, and the sort to locate the most useful information? Can users easily use the filters and keyword search features in conjunction with each other?

  • Do the layout and detail of the search results help users find relevant results, or do they make it harder to scan the results and quickly identify studies of interest?

Sample Tasks:

  1. Where would you go to search for [studies/projects]? Please navigate to that page. What did you notice first on this page? Which of the search filters do you think would be most helpful to you? Are there any filters you think would be helpful that are missing? Do you have a sense of the volume of [studies/projects] reviewed?

  2. You are interested in programs that will improve employment outcomes for low-income adults. Try to find [studies of high quality that examined long-term employment outcomes/projects that strive to improve long-term employment outcomes]. How many [studies/projects] did you find? Did you expect to find this many?

  3. Starting with the search results from the previous task, can you find a [study/project] that looks interesting to you? What made you choose that [study/project]?

  4. You are searching for studies that evaluated programs targeting the unemployed in urban regions. Find a job placement program evaluation that was published in 2010, found favorable program impacts, had high strength of evidence, and evaluated a program that targets the unemployed living in urban areas.

OR

You are searching for programs targeting the unemployed in urban regions. Find a job placement program that has evidence of effectiveness at improving short-term employment outcomes and targets the unemployed living in urban areas.

  5. You heard good things about a program used in Oklahoma City. You know there was an evaluation of this program that examined long-term employment outcomes and was authored by Laura Storto. Can you find this study?

OR

You heard good things about a program that was part of the NEWWS evaluation and you think you remember it was in Atlanta. It improved employment and earnings. Can you find this study?



  5. Test Objective: Test usability of the study and project detail pages.

  • Is it easy to locate the information most important to users?

  • Can the user determine how relevant the research is for them in terms of things like program implementation, population served, and setting?

  • Do users understand the terms used on the page? Do they understand the details associated with the study or project?

Sample Tasks:

  1. Find [studies of services/projects] that target single mothers. Looking at the search results, find a [study/project] that looks interesting. What made you choose this [study/project]? Go to this [study/project]’s detail page. Is this what you expected to see?

  2. Where would you look to understand what the study found?

OR

Where would you look to understand the effectiveness of the project’s approach?

  3. Can you describe the study sample?

OR

Can you describe what services are included in the project?

  4. If you were a provider of employment services to single working mothers, would you be interested in implementing the [program or services evaluated by this study/this project’s approach]? Why or why not?

  5. Is there any information of interest to you missing from this page?

Post-Test Questions

Facilitator says: Thank you. That was a very helpful session. Your feedback is very valuable to us.
Before we wrap up, could you please finish this quick questionnaire?

  1. Please rate the following on a scale of 1 to 5, where 5 is strongly agree
    and 1 is strongly disagree:

    • I understand the purpose of this website.

    • The search filters were easy to use and helped me to find the correct project or study.

    • It was easy to understand study details.

    • It was easy to understand project details.

    • It was easy to complete the test tasks.
      If you indicated above that anything was difficult, please describe why.

  2. What would be your main reason for using this website?

  3. If there were only one thing you could change about the website, what would it be?
    On the flip side, if there were only one thing you could keep the same, what would it be?

  4. Are there too many filters in the Search? Which ones would you be most likely to use?
    Which ones would you be less likely to use?

  5. What is your overall impression of the website? Is it easy to use? Is it efficient?

Usability Testing Metrics

This section describes the usability metrics our team will use to assess the test outcome.

  • Efficiency. Do testers find it easy to complete tasks? Measured by time spent on task and observations of testers' struggles.

  • Success rate. Can testers complete tasks? Measured as the percentage of testers who complete tasks without critical errors.

  • Accuracy. Are testers able to complete tasks correctly? Measure depends on the task at hand. For example, when asked to complete a specific search, did testers get the expected results?

  • Satisfaction. Are testers satisfied with their experience? Measured qualitatively, based on tester feedback and facilitator's observations of testers' struggles and success.
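The quantitative metrics above (success rate and time on task) lend themselves to simple tallies across sessions. As a minimal sketch of how they could be computed, assuming per-task session records are collected during testing (the data, task name, and field names below are hypothetical, not part of the protocol):

```python
# Illustrative tally of two quantitative usability metrics:
# success rate and average time on task. Session records are
# hypothetical examples, not actual test data.

from statistics import mean

# One record per tester per task: whether the task was completed
# without critical errors, and the time spent in seconds.
sessions = [
    {"task": "find_publication", "completed": True,  "seconds": 95},
    {"task": "find_publication", "completed": True,  "seconds": 140},
    {"task": "find_publication", "completed": False, "seconds": 210},
]

def success_rate(records):
    """Percentage of testers who completed the task without critical errors."""
    return 100.0 * sum(r["completed"] for r in records) / len(records)

def avg_time_on_task(records):
    """Mean time on task, in seconds, across all attempts."""
    return mean(r["seconds"] for r in records)

print(f"Success rate: {success_rate(sessions):.0f}%")
print(f"Avg time on task: {avg_time_on_task(sessions):.0f}s")
```

With the example records above, two of three testers completed the task, so the success rate is 67% and the average time on task is about 148 seconds; the satisfaction and accuracy metrics remain qualitative, as described above.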

Reporting Results

The results of the usability test will be presented either in a meeting or in a memo following each round of testing. The presentation will include a summary of findings and recommendations for resolving any usability issues that were found during the test. Each recommendation will also include a rating of its importance to site usability and a high-level estimate of the effort required to implement it, to help prioritize future site updates.




