MHV Summative Test

Generic Clearance for the Collection of Qualitative Feedback on Agency Service Delivery (NCA, VBA, VHA)


Header: U.S. Department of Veterans Affairs, Veterans Health Administration, Office of Informatics and Information Governance, Human Factors Engineering








MHV Summative Test

OMB No. 2900-0770
Estimated Burden: 60 minutes

Expiration Date: 08/31/2017









The Paperwork Reduction Act of 1995: This information is collected in accordance with section 3507 of the Paperwork Reduction Act of 1995. Accordingly, we may not conduct or sponsor, and you are not required to respond to, a collection of information unless it displays a valid OMB number. We anticipate that the time expended by all individuals who complete this survey will average 60 minutes. This includes the time it will take to follow instructions, gather the necessary facts, and respond to the questions asked. Customer satisfaction is used to gauge customer perceptions of VA services as well as customer expectations and desires. The results of this summative test will lead to improvements in the quality of service delivery by helping VA improve its services. Participation in this survey is voluntary, and failure to respond will have no impact on benefits to which you may be entitled.











Proposal for Summative UT of the My HealtheVet Prototype Website

Draft v0.2


Veterans Health Administration (VHA)

Human Factors Engineering (HFE)

05/05/2017





Version History

Version  Date        Comments
0.1      04/28/2017  Initial development based on kick-off meeting notes
0.2      05/05/2017  Draft delivered to William Plew













Document Approvals

Proposal for Summative UT of the My HealtheVet Prototype Website

Date:

Submitted By

_______________________________________ _____________________

[HFE POC Name] Date

[Title]

Human Factors Engineering (HFE), Office of Health Informatics and Information Governance (OIIG)

Veterans Health Administration

Concurrence

_______________________________________ _____________________

[Product Owner Name] Date

[Title]

[Program Office]


_______________________________________ _____________________

[Development POC Name] Date

[Title]

[Organization]



Executive Summary
This proposal was prepared in response to a request for a Summative Usability Test of the My HealtheVet prototype website to assess its usability at this stage in its development. The following proposal outlines the details of the study to include the objectives, scope, test environment and data, participants and any constraints and limitations.

The scope of the testing will focus on the evaluation of key capabilities using a summative test methodology, repeating tasks (or analogous tasks) from the baseline study. A total of 18 Veteran participants – both users and non-users of the current version of MHV – representing the range of target users will be recruited in advance via HFE’s recruitment vendor.

During each 60-minute session, participants will be asked to complete ten common tasks in six of the most commonly used focus areas of MHV. Both qualitative and quantitative data will be collected and key measures will include task success, task errors or difficulty, and overall effectiveness, efficiency and satisfaction. A task-based usability scale will also be employed to assess the user’s perception of the user experience.

The final report will include metrics, along with an interpretation of those metrics, as well as a list of ranked findings with actionable recommendations for remediation and/or improvement. Additional suggestions from participants for development of My HealtheVet will also be collected and included in the final report.

Introduction

Study Details

Proposal Author(s): Robert Gluck (VHA OIIG HFE, ArcSource Group; [email protected])

William Plew (VHA OIIG HFE; [email protected])

HFE Point of Contact: Nancy Wilck (VHA OIIG HFE; [email protected])

Application: My HealtheVet Prototype

Application Version: Current build

Study Sponsor: Jeff Sartori, Connected Care

Developer POCs: TBD

Device(s): Laptop and desktop computers

Application Description

My HealtheVet (MHV) was designed for Veterans, active duty service members, their dependents, and caregivers. It helps them to partner with their health care team by providing information and tools to make informed decisions and manage their health care. After producing and supporting MHV for more than a dozen years, the VA is developing a redesign of the patient portal based on user feedback, new technologies, and industry standards and needs. The patient portal’s development team is building the new MHV portal on a Liferay™ Content Management System (CMS) framework.

Objectives

The following study objectives represent the shared, high-level understanding between the HFE practitioners and the Connected Care program office requesting the UT. The objectives reflect an agreement on the purpose of conducting the study, the expected outcomes (i.e. the type of results that will be provided), and how the results are expected to be used.

  1. In 2015, HFE conducted a baseline study of the previous version of My HealtheVet. In 2016, HFE conducted a summative study of the redesign of My HealtheVet, repeating tasks (or analogous tasks) from the baseline study. This summative study will measure improvements relative to both the baseline study and the first summative study.

  2. Verify that the system features developed to date can be effectively, efficiently and satisfactorily used.

  3. Identify any “Serious” usability issues. If “Serious” issues are identified, HFE recommends that they be addressed prior to deployment.1

  4. Collect baseline data to determine whether users perceive the level of detail and the time required to perform tasks to be reasonable.

  5. Capture any new user requirements requested by participants that could support development of the features.

  6. Collect general feedback from participants about their experience using the product and offer recommendations for improvements that may help users better achieve their goals and complete their tasks.

Discoveries of where the My HealtheVet prototype website facilitated the workflow (that is, effectively, efficiently, and with satisfaction) as well as areas where it failed to meet the needs of the participants will be identified, documented, and ranked for severity and priority. Findings will also be accompanied by actionable recommendations for improvement. Testing will examine not only routine tasks but also unintended uses and potential for errors. The study will use a standard usability scale to evaluate task usability as well as other subjective questionnaires.

Proposed Study Design

HFE will perform summative testing on the My HealtheVet prototype website to deliver on the study objectives. Tasks performed by participants were developed for the original baseline test in collaboration with the Veterans and Consumer Health Informatics Office (V/CHIO).

Test Environment

Sessions will take place remotely with participants at their own location. The participant will join via web conference (most likely, WebEx™) and interact with the [product] by controlling an HFE laptop via the screen-sharing features of the application. Sessions (audio and screen actions) will be recorded using Morae™ software (v3.3.3) installed on the HFE laptop. Morae2 is usability software which captures pre-defined metrics for the tasks, including clicks, mouse movement and task time. After the session, the recording file will be transferred to the moderator via a secure File Transfer Protocol (FTP) utility for coding and analysis.

Test Data

The testing will require that participants interact with identical data as they perform the standardized tasks. To accommodate this, suitable data will be identified within the testing environment, and test scenarios will be developed for the tasks based on that selected data. The data will likely need to be augmented or corrected to ensure that it is complete (to support the tasks) and accurate.

Test Measures

The report issued on the results of this study will provide measures as follows:

  • Effectiveness – Objective measures of task success, task failures, and errors.

  • Efficiency – Objective measures of time on task and number of clicks to complete each task.

  • Satisfaction – Subjective measures that express user satisfaction with the ease of use of the system.

Success criteria and benchmark task times will be based on the 2016 summative test.
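As an illustration of how measures of this kind could be aggregated across participants, the sketch below computes per-task effectiveness, efficiency, and satisfaction summaries. The record fields and the 1–7 satisfaction scale are hypothetical assumptions for illustration only, not the study's actual Morae export format.

```python
# Illustrative only: aggregates per-task usability measures of the kind
# described above (task success, errors, time on task, clicks, satisfaction).
# Record layout and the 1-7 satisfaction scale are assumed, not prescribed.
from statistics import mean

sessions = [
    # (participant_id, task_id, success, errors, seconds, clicks, satisfaction 1-7)
    ("P01", "T1", True,  0, 45.2,  6, 6),
    ("P01", "T2", False, 2, 98.7, 14, 3),
    ("P02", "T1", True,  1, 52.0,  7, 5),
    ("P02", "T2", True,  0, 61.3,  9, 6),
]

def summarize(task_id):
    """Summarize one task's measures across all participants."""
    rows = [r for r in sessions if r[1] == task_id]
    return {
        "success_rate": mean(1.0 if r[2] else 0.0 for r in rows),  # effectiveness
        "mean_errors": mean(r[3] for r in rows),                   # effectiveness
        "mean_seconds": mean(r[4] for r in rows),                  # efficiency
        "mean_clicks": mean(r[5] for r in rows),                   # efficiency
        "mean_satisfaction": mean(r[6] for r in rows),             # satisfaction
    }

print(summarize("T1"))
```

Per-task summaries like these could then be compared against the success criteria and benchmark task times from the 2016 summative test.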

Participants

A total of 18 participants will be recruited for the UT, with additional participants recruited for practice or dry-run purposes. Recruited participants will have a mix of backgrounds and demographic characteristics conforming to the recruitment screener. Participant names will be replaced with Participant IDs so that no data can be tied back to an individual’s identity.

HFE will schedule each participant to attend one individual UT session (60 minutes). Sessions will be scheduled with at least 30 minutes in between each, for debrief by the administrator(s) and data logger(s), and to reset HFE test tools to the proper test conditions. A maximum of four participants will be scheduled for each test session day, resulting in 5-10 days of testing.

Constraints & Limitations

The study design presents a number of risks for consideration by the Program Office. They include:

  • Incomplete My HealtheVet prototype build. The build may not be complete prior to the testing sessions. Testing must be scheduled as planned in order to complete the study in a timely manner and inform development prior to deployment (see schedule). As a result, usability findings may be limited or incomplete. Additional usability evaluation is also recommended in conjunction with any field testing.

  • Limited functionality available in the My HealtheVet prototype version. In order for participants to seamlessly interact with the system to accomplish a task, they may want to access areas of the system that are not yet functional. This could result in inaccurate task times or impact collection of errors. If this occurs, it will be noted and communicated to the Program Office with the final report.

  • Stability of Test Environment. The test will be conducted in an environment that may become unresponsive at times. Should this occur during a test session, it could result in lost data or the need to reschedule the session. If this occurs, HFE will discuss schedule slippage with the Program Office. Together, the team will decide whether to stop the study or extend the schedule.

  • Network or System Latency. Remote UT introduces the strong possibility that the participant will experience latency during the session. This lag can impact quantitative metrics of task time and influence perceived participant satisfaction with the system. It can also potentially result in lost sessions or the need to reschedule sessions. Again, if this occurs, HFE and the Program Office will decide on next steps.

  • Test Accounts and Simulated Data. Test accounts with test data will be reviewed for [clinical] relevance and accuracy. However, the data will be fictional. HFE will review test data with subject matter experts to perform the due diligence needed to ensure data accuracy and avoid any distractions for the participant.

  • Holidays and Vacations. May 29 and July 4 are Federal holidays. HFE staff will be on vacation as follows:

    • Robert Gluck: June 8, 9, 12

    • William Plew: June 12-16

These risks and limitations will be recommunicated to the Program Office in conjunction with the final report.

Proposed Schedule

The following timeframes are provided to plan, execute and report on the study. This schedule is subject to change.

Milestone/Tasks                                          Date                   Approx. Duration     Responsible
Study Proposal (this document)                           May 8, 2017            N/A                  HFE
Participant Recruiting                                   May 9-22, 2017         10 days              HFE
Refresh Test Environment                                 May 15, 2017           1 day                Dev Team
Scenarios/Tasks Working Session (to finalize materials)  May 16, 2017           One 2-hour session   HFE, Program Office
Verification of Test Configuration                       May 17, 2017           One 2-hour session   HFE, Program Office
Study Plan Delivery                                                             2 weeks before Test  HFE
Dry Run with Participant                                 May 18, 2017           1 day                HFE
Test Sessions                                            May 23-June 6, 2017    10 days              HFE
Findings and Data Compilation (produce report;           June 13-July 13, 2017  16 days              HFE
  includes internal HFE peer review)
Draft Report Briefing                                    July 17, 2017          2 hours              HFE
Final Report Delivery                                    July 19, 2017          2 days               HFE

Table 2: Proposed Study Schedule

*Dates are tentative and may change based on scheduling and other constraints.

1 For more information on HFE’s ranking system, see http://hcs.sagepub.com/content/4/1/23.abstract or contact Ashley Cook for a copy of the paper.

2 For more information about Morae: https://www.techsmith.com/morae.html
