Home Visiting Evidence of Effectiveness – Website Usability Testing

Fast Track Generic Clearance for Collection of Qualitative Feedback on Agency Service Delivery

OMB: 0970-0401


Home Visiting Evidence of Effectiveness – Website Usability Testing instrument, instructions, and scripts

Document Overview

This document describes our plan for testing the usability of the Home Visiting Evidence of Effectiveness (HomVEE) website functionality and features in one-on-one user testing and interviews.

Participants

The usability test/user interview participants will include members of the website’s intended audience:

  • Health Resources & Services Administration (HRSA) staff and other federal staff

  • Tribal Home Visiting staff

  • Researchers affiliated with Home Visiting Applied Research Collaborative (HARC) or experts HomVEE has consulted as part of our work

  • Field practitioners

  • Home visiting model developers and MIECHV awardees

Procedure

Each usability test session will be scheduled for approximately 60 minutes and will include one tester. The testing will be conducted remotely using the Webex video conferencing platform. Test sessions will be recorded and the recordings will be shared for analysis with project staff members only.

Testers will be emailed an invitation for the test (see Attachment). The invitation will briefly explain the purpose and the reasons for the usability test. Once they agree to participate, testers will receive a separate meeting invitation that includes the Webex meeting information.

During the test, testers will share their screen with the facilitator and observers so that the testers’ actions can be observed. The facilitator will brief testers about the purpose of the test and make clear that the testers will be evaluating the website, not that the facilitator will be evaluating the testers. The facilitator will ask testers for permission to record their sessions. Testers will be asked to complete a brief pre-test questionnaire (see below to view the questionnaire). The facilitator will share copies of the test scenarios with testers and will instruct them to begin tasks when they are ready, to read each task scenario aloud before starting it, and to move on to the next task when they feel they have completed the previous one. The website usability assessment will begin once testers start their first task. The facilitator will encourage testers to ‘think aloud’ as they try to complete tasks. The facilitator and test observers will note testers’ behavior and comments, and the facilitator may prompt testers to explain their actions. The facilitator may also help testers get back on track if their actions are no longer helpful for discovering usability issues. After all tasks have been attempted, the tester will complete a post-test questionnaire (see below to view the questionnaire).

Usability Test Script

First of all, let me introduce myself: my name is [Facilitator name], and I am a [Facilitator position on project]. I’m going to walk you through this usability test of the HomVEE website. Your participation is completely voluntary, and we are grateful for your help. I need to tell you that an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB number for this information collection is 0970-0401 and the expiration date is MM/DD/YYYY.

You probably already have some idea of what we’re trying to do here, but let me give you a brief overview. The purpose of this usability test is to determine if people find the HomVEE website easy to use and navigate. To that end, I’m going to ask you to perform some tasks on the HomVEE website. As you work through these tasks, try to think out loud and tell us what you’re looking at, what you’re trying to do, and what you’re thinking. Work at your own pace; there is no rush. Go on to the next task when you’re ready.

The test shouldn’t take more than an hour. If you need to take a break at any point, just let me know. If you have any questions as we go along, just ask them. I may not be able to answer them right away, since we’re interested in seeing what people do without help, but if you still have any questions when we’re done I’ll try to answer them then. If you think that you are getting stuck on a particular task, feel free to go back and re-read the task question.

One thing I want to make clear is that we’re testing the website, not you. You can’t do anything wrong here. Also, please don’t worry about hurting our feelings. We’re doing this to improve the site, so we need to hear your honest feedback.

With your permission, we’re going to record what happens on the screen and our conversation. The recording will only be used to help us figure out how to improve the site, and it won’t be seen by anyone except the people working on this project.

  • HIT THE RECORD BUTTON!!!

Now, I am going to ask you a few quick questions before we start the usability test:

Pre-Test Questions

  1. Which describes you best: I am a researcher. I am a policy maker. I design and/or implement early childhood home visiting models.

  2. Do you use websites to find research?

    • If yes, how often? What tasks are you trying to accomplish on those sites? Based on your experience, what do you like and dislike about the sites you’ve used to find research?

  3. Have you used any other evidence review websites, for example the Department of Education’s What Works Clearinghouse, the Department of Labor’s Clearinghouse for Labor Evaluation and Research, ACF’s Pathways to Work Evidence Clearinghouse, ACF’s Employment Strategies Evidence Review, or the Prevention Services Clearinghouse?

    • If yes, what other sites have you used? What do you like/dislike about these sites?

  4. Do you have any favorite websites, research related or not, that you love to use? Which websites are they, and what do you like about them?

Usability Tasks

Facilitator says: Thank you, the context you provided was very helpful.

We have a set of tasks that I’d like you to attempt today. Please read each task out loud before you start it. Move on to the next task when you feel you have completed the previous task. Start whenever you feel ready, and once again, please say aloud what you’re looking at, what you’re trying to do, and what you’re thinking.

Note: The specific tasks tested with each tester will be tailored based on the status of site redesign and the tester’s relationship to the project, but will include some combination of the types of sample tasks below.

  1. Objective: Test the site messaging and theme.

  • Can users determine the purpose of the site from the home page?

  • Does the home page encourage users to continue exploring the site and using site features?

  • Does it suggest actions for the users to take when visiting the site?

Sample tasks/questions:

  1. Looking at the website’s home page, what kind of information do you think you would find on this site? What is the first thing that catches your attention? What action do you think you would take next?

  2. Looking at the website navigation items and without visiting those sections, can you tell me what information or features you would expect to find in each of those sections?



  2. Objective: Test for usability issues related to the About section.

  • Can the user find information about the purpose and scope of the evidence review?

  • Will users expect to find this information in the About section of the site, or would they expect to find it somewhere else?

Sample tasks/questions:

  1. What area or topic of research is reviewed on the website?

  2. What types of studies were reviewed?

  3. What are the requirements a study must meet to be included in the evidence review?

  4. How do studies with high ratings differ from studies with low ratings?



  3. Objective: Test for usability issues related to the Publications section.

  • Does the layout of the publication section give users the information they need to determine if the publication contains the desired information?

Sample tasks/questions:

  1. What types of publications are available through the site? What topics are covered?

  2. Can you find a publication about [publication subject]?

  4. Objective: Test usability of the model and manuscript searches.

  • Do the layout and number of filter options make the search filters difficult to use?

  • Do keyword and filter searches return the expected results? Can users successfully use search features to locate the most useful information? Can users easily use the filters and keyword search features in conjunction with each other?

  • Do the layout and level of detail in the search results help users find relevant results, or do they make it harder to scan the results and quickly identify models of interest?

Sample tasks/questions:

  1. Where would you go to search for [models/manuscripts]? Please navigate to that page. What did you notice first on this page? Which of the search filters do you think would be most helpful to you? Are there any filters you think would be helpful that are missing? Do you have a sense of the volume of [models/manuscripts] reviewed?

  2. You are interested in home visiting programs focused on new parents of children between 0 and 12 months of age. Try to find a home visiting model [with evidence it improves child health outcomes]. How many models did you find? Did you expect to find this number of models?

  3. Starting with the search results from the previous task, can you find a model that looks interesting to you? What made you choose that model?

  4. You are searching for studies of home visiting programs focused on pregnant people. Find a home visiting program evaluation that was published in 2010, found favorable program impacts, had evidence of effectiveness (met HHS criteria), and evaluated a program that targets pregnant people.

  5. You heard good things about a model used with tribal populations. You know there was an evaluation of this program that examined maternal health outcomes and included research co-authored by Valerie Coho-Mescal. Can you find information about this manuscript?



  5. Objective: Test usability of the model and manuscript detail pages.

  • Is it easy to locate the information most important to users?

  • Can the user determine how relevant the research is for them in terms of factors such as program implementation, population served, and setting?

  • Do users understand the terms used on the page? Do they understand the details associated with the study or project?

Sample tasks/questions:

  1. Find research on models that target parents of children aged 12 to 23 months. Looking at the search results, find a [model/manuscript] that looks interesting. What made you choose this [model/manuscript]? Go to this [model/manuscript]’s detail page. Is this what you expected to see?

  2. Where would you look to review study findings?

OR

Where would you look to understand the effectiveness of the model’s approach?

  3. Can you describe the study sample?

OR

Can you describe the model’s services?

  4. If you were selecting a home visiting model to implement, would you be interested in choosing the model evaluated by this research? Why or why not?

  5. Is there any information of interest to you missing from this page?

Post-Test Questions

Facilitator says: Thank you. That was a very helpful session. Your feedback is very valuable to us.
Before we wrap up, could you please finish this quick questionnaire?

  1. Please rate the following on a scale of 1 to 5, where 5 is strongly agree
    and 1 is strongly disagree:

    • I understand the purpose of this website.

    • The search filters were easy to use and helped me to find useful information.

    • It was easy to understand study details.

    • It was easy to understand model details.

    • It was easy to complete the test tasks.
      If you indicated above that anything was difficult, please describe why.

  2. What would be your main reason for using this website?

  3. If there was only one thing you could change about the website, what would it be?
    On the flip side, if there was only one thing you could keep the same, what would it be?

  4. What is your overall impression of the website? Is it easy to use? Is it efficient?

Usability Testing Metrics

This section describes the usability metrics our team will use to assess the test outcome.

  • Efficiency. Do testers find it easy to complete tasks? Measured by time spent on task and observations of testers' struggles.

  • Success rate. Can testers complete tasks? Measured as the percentage of testers who complete tasks without critical errors.

  • Accuracy. Are testers able to complete tasks correctly? Measure depends on the task at hand. For example, when asked to complete a specific search, did testers get the expected results?

  • Satisfaction. Are testers satisfied with their experience? Measured qualitatively, based on tester feedback and facilitator's observations of testers' struggles and success.
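For illustration only, the brief sketch below shows one way the quantitative metrics above (success rate and time on task) could be tabulated from session notes. The record layout, field names, and sample values are assumptions made for this sketch and are not part of the instrument.

    # Hypothetical sketch: tabulating success rate and time on task from session notes.
    # Field names and sample values are illustrative assumptions, not project data.
    from statistics import mean

    # Each record summarizes one tester's attempt at one task.
    observations = [
        {"tester": "T1", "task": 1, "completed": True, "critical_error": False, "seconds": 95},
        {"tester": "T2", "task": 1, "completed": True, "critical_error": True, "seconds": 210},
        {"tester": "T3", "task": 1, "completed": False, "critical_error": False, "seconds": 300},
    ]

    # Success rate: percentage of testers who completed the task without critical errors.
    successes = [o for o in observations if o["completed"] and not o["critical_error"]]
    success_rate = 100 * len(successes) / len(observations)

    # Efficiency: average time spent on the task, in seconds.
    avg_time = mean(o["seconds"] for o in observations)

    print(f"Task 1 success rate: {success_rate:.0f}%")
    print(f"Task 1 average time on task: {avg_time:.0f} seconds")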


Attachment

Invitation to Potential HomVEE Website User Tester

From: [email protected]

Subject: Request for Help Testing Website on Evidence-Based Early Childhood Home Visiting Models

CC: [email protected]

Dear [Dr/Mr/Ms. X],

I am writing to ask if you would consider participating in a test of the Home Visiting Evidence of Effectiveness (HomVEE) Review’s website. HomVEE assesses the evidence of effectiveness for early childhood home visiting models, and shares its review results via the HomVEE website. (There is a brief overview of the project at the end of this email.)

We are currently redesigning the HomVEE website and are seeking [Type of website user] willing to help us test the website. [Tailored based on how we received contact info: Our project officers at the United States Department of Health and Human Services] recommended we contact you because they believed you or someone on your staff would give thoughtful consideration to our website and how it could be enhanced to best meet your needs.

The testing involves a one-hour call to review a version of the redesigned HomVEE website. We will use your input to check updates we have already made and to inform our next steps as we continue with the redesign. This would require no preparation, and would simply be a time for us to watch you interact with the site and then ask you about your experience. During the test, you would share your screen with us via WebEx. This call would happen within the next few weeks, at a time convenient for your schedule.

Please let us know if you are willing to be a HomVEE website tester. If you are, our test facilitator will contact you to identify a good time for the test and set up the WebEx meeting. Additionally, if you have colleagues who you think might be interested in helping us shape this site for [Type of website user], please help us connect with those people as well.

Thank you for your consideration of this opportunity.

Sincerely,

Rebecca Coughlin

Project Director

About HomVEE and this activity

HomVEE is a comprehensive review of the evidence base for early childhood home visiting models for families with pregnant people and children from birth to kindergarten entry. The review is conducted by Mathematica and sponsored by the Office of Planning, Research, and Evaluation (OPRE) of the Administration for Children and Families (ACF) within the U.S. Department of Health and Human Services (HHS), in partnership with the Health Resources and Services Administration (HRSA). For more information, visit the project’s website: https://homvee.acf.hhs.gov.

PAPERWORK REDUCTION ACT OF 1995 (Pub. L. 104-13) STATEMENT OF PUBLIC BURDEN: Through this information collection, ACF is gathering information to inform the re-design of a public website containing information about evidence-based home visiting programs. The purpose of this information collection is to inform the website’s content and design. Public reporting burden for this collection of information is estimated to average 60 minutes per respondent, including the time for reviewing instructions, gathering and maintaining the data needed, and reviewing the collection of information. This is a voluntary collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information subject to the requirements of the Paperwork Reduction Act of 1995, unless it displays a currently valid OMB control number. The OMB # is 0970-0401 and the expiration date is MM/DD/YYYY. If you have any comments on this collection of information, please contact your test facilitator.


Email for Usability Test Webex meeting

From: [Test Facilitator]

Subject: Today’s HomVEE Website Test

Hello [Name],

Thank you for agreeing to participate in a usability test of the HomVEE website. In preparation for our meeting today, scheduled for [time], I wanted to share a few additional materials.

  • Testing website: URL

  • Test tasks (we will attach to the email a PDF listing the test tasks with one task per page)

Please refrain from opening these materials until the start of the usability test.

Please let me know if you have any questions. I look forward to meeting you.

Kind regards,

[Test Facilitator]



PAPERWORK REDUCTION ACT OF 1995 (Pub. L. 104-13) STATEMENT OF PUBLIC BURDEN: Through this information collection, ACF is gathering information to inform the re-design of a public website containing information about evidence-based home visiting programs. The purpose of this information collection is to inform the website’s content and design. Public reporting burden for this collection of information is estimated to average 60 minutes per respondent, including the time for reviewing instructions, gathering and maintaining the data needed, and reviewing the collection of information. This is a voluntary collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information subject to the requirements of the Paperwork Reduction Act of 1995, unless it displays a currently valid OMB control number. The OMB # is 0970-0401 and the expiration date is MM/DD/YYYY. If you have any comments on this collection of information, please contact your test facilitator.


