SUPPORTING STATEMENT


Part A







Advancing the Collection and Use of Patient-Reported Outcomes through Health Information Technology

OMB CONTROL NUMBER: 0935-0179




Version: October 18th, 2018







Agency for Healthcare Research and Quality (AHRQ)




Contents






A. Justification


1. Circumstances that make the collection of information necessary


The mission of the Agency for Healthcare Research and Quality (AHRQ), set out in its authorizing legislation, the Healthcare Research and Quality Act of 1999 (see http://www.ahrq.gov/hrqa99.pdf), is to enhance the quality, appropriateness, and effectiveness of health services, and access to such services, through the establishment of a broad base of scientific research and through the promotion of improvements in clinical and health systems practices, including the prevention of diseases and other health conditions. AHRQ shall promote health care quality improvement by conducting and supporting:


1. research that develops and presents scientific evidence regarding all aspects of health care; and


2. the synthesis and dissemination of available scientific evidence for use by patients, consumers, practitioners, providers, purchasers, policy makers, and educators; and


3. initiatives to advance private and public efforts to improve health care quality.


Also, AHRQ shall conduct and support research and evaluations, and support demonstration projects, with respect to (A) the delivery of health care in inner-city areas, and in rural areas (including frontier areas); and (B) health care for priority populations, which shall include (1) low-income groups, (2) minority groups, (3) women, (4) children, (5) the elderly, and (6) individuals with special health care needs, including individuals with disabilities and individuals who need chronic care or end-of-life health care.


Patient-Reported Outcomes (PROs) are assessments that capture patients' symptoms and health status directly from their own perspective, including measures such as physical function, pain, and health-related quality of life, without interpretation of the patient's responses by a clinician.1,2 PROs are critical to the effective care of patients, with PRO data informing provider practice and supporting important research on patient outcomes.3 Technical tools, such as software applications, can improve the collection and analysis of PRO data; however, this technology must be usable by patients, providers, and researchers.3 Further, there must be clear standards driving the development of PRO applications (apps), and the integration of these apps with EHRs, so that they can be implemented consistently and seamlessly and the resulting data can be easily combined for large-scale analyses. No such standards for PRO apps currently exist, which creates challenges for providers, patients, and researchers. As a result, each data collection can be unique and require extensive resources from IT teams at each clinical site, making current PRO app implementations burdensome and the technology suboptimal in usability.


The proposed effort is a critical step toward testing AHRQ standards for PRO app development, implementation, and effective use of data by providers, patients, and researchers. A comprehensive report will be produced that clearly identifies success factors, barriers, and facilitators to implementing the PRO apps, the usability of the apps, and recommendations for future activities relevant to the key stakeholders. This report will be instrumental in informing AHRQ's objective of standardizing PRO applications and making these applications more usable and useful for all stakeholders.


This research has the following goals:

1) Understand the usability and functional requirements of stakeholders involved in the development, implementation, and use of PRO apps


2) Rigorously evaluate the implementation and use of two PRO apps that meet AHRQ’s technical specifications and requirements of patients, providers, and researchers.


To achieve the goals of this project, the following data collections will be implemented:

1) PROMIS® items: Two separately developed apps will be used to collect PRO data on patients' physical function using Patient-Reported Outcomes Measurement Information System (PROMIS®) items. Both apps will incorporate AHRQ's technical specifications to support the collection of standardized PRO data. The PROMIS Physical Function v2.0 is the core measure and will be administered at 18 primary and specialty care practices (nine practices per app) in Washington, DC, Maryland, and Virginia. Each practice site has the option to administer additional PROMIS items, including Mobility 2.0, Upper Extremity 2.0, Physical Function for Samples with Mobility Aid Users 1.0, and Physical Function short form 10a. Target patient populations include patients who are 65 or older, post-procedure, undergoing rehabilitation, and English proficient. Ten patients will be selected from each practice site. Training materials have also been created to help participants complete the app easily.


2) System Usability Scale (SUS): The SUS is a brief questionnaire that is widely considered the industry standard for evaluating the usability of hardware, software, mobile devices, websites, and applications. Each PRO app will be programmed to prompt the 180 patients to complete the SUS immediately after they submit their PROMIS data.
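
For context, the SUS yields a 0-100 score using a conventional scoring rule (odd-numbered items score as the response minus 1, even-numbered items as 5 minus the response, with the sum multiplied by 2.5). This Supporting Statement does not prescribe how SUS responses will be scored, so the short Python sketch below is offered only as an illustration of that standard calculation, not as part of the planned data collection.

    # Illustrative only: the conventional SUS scoring rule; this Supporting
    # Statement does not prescribe a particular scoring implementation.
    def sus_score(responses):
        """Return the 0-100 SUS score for ten responses on a 1-5 scale."""
        if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
            raise ValueError("SUS requires ten responses, each rated 1-5")
        # Items 1, 3, 5, 7, 9 contribute (response - 1); items 2, 4, 6, 8, 10
        # contribute (5 - response); the sum is scaled by 2.5.
        total = sum((r - 1) if i % 2 == 0 else (5 - r)
                    for i, r in enumerate(responses))
        return total * 2.5

    # Example: a fairly favorable set of responses scores 80.0.
    print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]))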


3) Readiness to Change Assessment: To provide context for evaluation findings, all participating sites will complete a baseline organizational readiness to change assessment. This paper-based survey assesses organizational readiness to implement a new app, and provides an overall indication of the likelihood of its success prior to implementation.


4) Process Usage Worksheet (Patients): During app implementation, each practice site will complete a standard usage and process worksheet electronically every week for the duration of the 6-month testing period. Worksheets will track: 1) # of patients eligible to receive the PRO app; 2) # of patients receiving the PRO app; 3) # of patients submitting PRO data; and 4) # of patients who decline the PRO app. The worksheet will take fewer than five minutes to complete. In addition, the worksheet will be used to examine passive usage data collected from the OBERD app, such as the amount of time a patient uses the app or whether the patient abandons it.


5) Process Usage Worksheet (Providers): To understand the provider organization's view of how the PRO measures are used by the clinical team, each site lead will complete a brief usage process worksheet collecting two data points each week. The worksheet will track: 1) # of patients whose PRO data were accessed by the provider; and 2) a rating of how informative the PRO data were to the goals of the provider accessing the information. Providers will be asked to rate this on a 1 to 5 scale, with 5 being highly informative to their goals.


6) Patient Interviews: Semi-structured interviews will be conducted with up to 18 patients (up to nine patients for each app) to elicit feedback on general use of the PRO app, including ease of use, preferences, and experience with the app. These interviews will occur during the testing period, after patients complete the survey. After each patient completes the survey, the practice staff will ask the patient whether he or she is willing to be contacted for a 30-minute follow-up phone interview. Once enrollment is complete, we will randomly select one consenting participant per practice for the 30-minute follow-up interview.


7) Provider Interviews: Following the completion of app usage by 10 patients at each site, semi-structured interviews will be scheduled and conducted by telephone with one provider at each of the 18 practice sites. During this interview, providers will be shown the usage statistics from the process worksheets completed for their practice. Topics addressed will include: 1) how they actually used PRO data to inform clinical decision making; 2) reasons for use or non-use of PRO data; 3) how the process can be optimized; and 4) general satisfaction, preferences, and barriers.


8) Health Information Technology (HIT) Professional Interviews: To understand IT staff's experience with the implementation, semi-structured interviews will be conducted with one IT implementation staff member at each of the 18 practice sites, focusing on satisfaction with the implementation process, preferences, and barriers.


This study is being conducted by AHRQ through its contractor, MedStar Health, pursuant to AHRQ’s statutory authority to conduct and support research on healthcare and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of healthcare services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).


2. Purpose and Use of Information

To understand the needs of stakeholders, this project will use a rigorous mixed methods approach to elicit information about the functionality and usability of two PRO apps, including barriers and facilitators to successful design, development, implementation, and use. One app will be modified from an existing app (the OBERD app) that is being used at MedStar Health. The other app (the challenge-winning app) will be developed through a challenge competition held by AHRQ. Both apps will be implemented and evaluated in MedStar Health primary care and specialty care practices, as well as in primary care practices of the Capital Area Primary Care Research Network (CAPRICORN), a practice-based research network (PBRN). App implementation and evaluation at different sites will provide diverse testing conditions, including variability in clinical workflow, patient demographics, and EHRs. The findings from this study will inform future modification of the two PRO apps that are being tested.


The proposed effort is a critical step toward testing AHRQ standards for PRO app development, implementation, and effective use of the resulting data by providers, patients, and researchers. Our team will produce a comprehensive report that clearly identifies success factors, barriers, and facilitators to implementing the PRO apps, the usability of the apps, and recommendations for future activities relevant to the key stakeholders. This report will be instrumental in informing AHRQ's objective of developing user-friendly apps to collect standardized PRO data.

3. Use of Improved Information Technology

Our technical team will work with each site to ensure that PRO data are seamlessly integrated with each provider's EHR so that the data are available during clinic visits. The patient-facing apps will be easily accessible to patients on a mobile device. The team will ensure the apps are correctly configured with the desired PROMIS measure. To minimize data collection burden on the patient, the computer adaptive test (CAT) version of the selected PROMIS physical function measure will be used. An optimally designed app interface will allow patients to submit PRO data with minimal effort. The SUS will be administered to each patient who uses the app to submit PRO data; the PRO apps will be programmed to present the SUS questions immediately after the PROMIS items.


Rapid on-site integration testing will take place to ensure appropriate functioning in each specific clinical environment. Our usability core team will work with patients and providers at each site to ensure their needs and workflows are being met before the apps go live.


After completion of the six-month pilot, it will be necessary to remove any aspects of the technology that altered patient and provider workflows which the provider no longer wishes to retain.


4. Efforts to Identify Duplication

No similar data have been gathered by the research team or are available from other sources known to the research team.


5. Involvement of Small Entities

It is unlikely that any sites participating in this pilot test will be classified as small businesses.


6. Consequences if Information Collected Less Frequently

Data collected through the PRO app, the SUS, and the patient interviews will be collected only once from each patient.


Data collected through the Readiness for Change Assessment, provider interviews, and healthcare information technology professional interviews will be collected one time from each participant.


Data collected through the patient and provider process and usage worksheets will be collected once a week until full patient enrollment. Practice sites have indicated that in many cases the target enrollment could be reached in just a few days or weeks; as such, the usage worksheets will be collected once a week only for the period in which patients are actively being enrolled. The worksheets will take fewer than five minutes to complete, which does not constitute a significant burden to the sites. If the patient and provider process and usage worksheets were not collected weekly, the time burden on the site coordinators would likely be larger.


7. Special Circumstances

This request is consistent with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply.


8. Federal Register Notice and Outside Consultations


8.a. Federal Register Notice

A Federal Register notice is not required for this generic clearance.


8.b. Outside Consultations

A technical expert panel (TEP) composed of patients, providers, researchers, and developers from industry and academia will inform all aspects of the project to ensure robust stakeholder input. The TEP represents expertise in HIT implementation and usability, application development, patient safety, and patient needs. The TEP will convene for three two-hour virtual meetings, led by Dr. Rollin (Terry) Fairbanks, who is AVP of Ambulatory Quality and Safety at MedStar Health and an expert in HIT, human factors engineering, and patient safety. The panel will provide input on the implementation plan and pilot testing findings.

9. Payments/Gifts to Respondents

Patients completing the PROMIS items and SUS on the PRO apps:

In prior experience recruiting participants for stakeholder feedback, providing incentives of even relatively modest amounts significantly improves recruitment and retention. This is especially the case for patients, who otherwise have competing demands on their time, and for providers, who tend to be extremely challenging to recruit given their schedules and availability. For example, on a previous ACTION III TO3 (patient and family engagement), incentives provided to primary care providers and patients were critical to recruiting sufficient numbers for semi-structured interviews during both the environmental scan and the pre- and post-intervention stakeholder feedback sessions. In fact, even with incentives available it often proved challenging to recruit providers, given the constraints of their schedules between seeing patients and the potential time lost if they are not available to patients at any point during their day. Additionally, we have extensive experience recruiting patients for similar interviews and surveys; in each instance, recruitment and retention would not have been successful without the use of financial incentives. In another study, conducted in partnership with the American Medical Association, emergency medicine physicians were compensated for their time spent completing a usability evaluation and interview session. Further, there is a body of literature that supports the use of incentives to effectively recruit and retain research participants in studies such as this.4,5,6


Data provided by providers and HIT professionals: Providers and HIT professionals participating in this study will not receive any payment or remuneration. Participating sites will have valuable PRO data for key patient populations integrated into their EHR systems for providers to use during patients' clinic visits; this was described as valuable in earlier provider interviews. Additionally, upon completion of data collection, each site may retain the elements of the technical setup that it wishes to continue using.


10. Assurance of Confidentiality

Individuals and organizations will be assured of the confidentiality of their replies under Section 944(c) of the Public Health Service Act.  42 U.S.C. 299c-3(c).  That law requires that information collected for research conducted or supported by AHRQ that identifies individuals or establishments be used only for the purpose for which it was supplied.


Information that can directly identify the respondent, such as name and/or Social Security number, will not be collected.


11. Questions of a Sensitive Nature

Surveys and interview tools have been designed so that they do not ask questions of a sensitive nature. If any question is perceived to be sensitive during data collection, it will be modified to address the concern.

12. Estimates of Annualized Burden Hours and Costs


Exhibit 1.  Estimated annualized burden hours

Form Name | Number of respondents | Number of responses per respondent | Hours per response | Total burden hours
PROMIS items | 180 | 1 | 0.25 | 45
System Usability Scale (SUS) | 180 | 1 | 0.08 | 14.4
Readiness to Change Assessment | 18 | 1 | 0.50 | 9
Process Usage Worksheet (Patients) | 18 | 26 | 0.08 | 37.4
Process Usage Worksheet (Providers) | 18 | 26 | 0.08 | 37.4
Patient Interviews | 18 | 1 | 0.5 | 9
Provider Interviews | 18 | 1 | 0.5 | 9
HIT Professional Interviews | 18 | 1 | 0.5 | 9
Total | 468 | | | 170.2


Exhibit 2. Estimated annualized cost burden

Form Name | Number of respondents | Total burden hours | Average hourly wage rate* | Total cost burden
PROMIS items | 180 | 45 | $24.34 (a) | $1,095.30
System Usability Scale (SUS) | 180 | 14.4 | $24.34 (a) | $350.50
Readiness to Change Assessment | 18 | 9 | $38.83 (b) | $349.47
Process Usage Worksheet (Patients) | 18 | 37.4 | $38.83 (b) | $1,452.24
Process Usage Worksheet (Providers) | 18 | 37.4 | $38.83 (b) | $1,452.24
Patient Interviews | 18 | 9 | $24.34 (a) | $219.06
Provider Interviews | 18 | 9 | $103.22 (c) | $928.98
HIT Professional Interviews | 18 | 9 | $43.16 (d) | $388.44
Total | | | | $6,236.23

* National Compensation Survey: Occupational Wages in the United States, May 2017, U.S. Department of Labor, Bureau of Labor Statistics.

(a) Based on the mean wages for All Occupations (00-0000).

(b) Based on the mean wages for all Healthcare Practitioners and Technical Occupations (29-0000).

(c) Based on the mean wages for Physicians and Surgeons (29-1060).

(d) Based on the mean wages for all Computer Occupations (15-1100).
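
As a cross-check of the arithmetic in Exhibits 1 and 2, the short Python sketch below recomputes each form's total burden hours as respondents x responses per respondent x hours per response, and its cost burden as burden hours x average hourly wage. It is illustrative only; all figures are copied directly from the tables above and nothing new is assumed.

    # Illustrative cross-check of Exhibits 1 and 2; figures copied from the tables.
    # Each row: (form name, respondents, responses per respondent,
    #            hours per response, average hourly wage).
    rows = [
        ("PROMIS items",                        180,  1, 0.25,  24.34),
        ("System Usability Scale (SUS)",        180,  1, 0.08,  24.34),
        ("Readiness to Change Assessment",       18,  1, 0.50,  38.83),
        ("Process Usage Worksheet (Patients)",   18, 26, 0.08,  38.83),
        ("Process Usage Worksheet (Providers)",  18, 26, 0.08,  38.83),
        ("Patient Interviews",                   18,  1, 0.5,   24.34),
        ("Provider Interviews",                  18,  1, 0.5,  103.22),
        ("HIT Professional Interviews",          18,  1, 0.5,   43.16),
    ]

    total_hours = total_cost = 0.0
    for name, n, responses, hours_per_response, wage in rows:
        burden_hours = round(n * responses * hours_per_response, 1)  # Exhibit 1
        cost = round(burden_hours * wage, 2)                         # Exhibit 2
        total_hours += burden_hours
        total_cost += cost
        print(f"{name}: {burden_hours} hours, ${cost:,.2f}")

    print(f"Total: {round(total_hours, 1)} hours, ${total_cost:,.2f}")
    # Expected output for the totals: 170.2 hours, $6,236.23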


13. Estimates of Annualized Respondent Capital and Maintenance Costs

There are no direct costs to respondents other than their time to participate in the study.


14. Estimates of Total and Annualized Cost to the Government

Exhibit 3a.  Estimated Total and Annualized Cost

Cost Component | Total Cost | Annualized Cost
Project Development | $40,296.20 | $40,296.20
Data Collection Activities | $360,292.76 | $360,292.76
Data Processing and Analysis | $25,960.75 | $25,960.75
Publication of Results | $12,980.37 | $12,980.37
Project Management | $27,978.00 | $27,978.00
Overhead | $271,154.96 | $271,154.96
Total | $738,663.52 | $738,663.52


Exhibit 3b. Federal Government Personnel Cost

Activity | Federal Personnel | Hourly Rate | Estimated Hours | Cost
Data Collection Oversight | Health Scientist Administrator, GS-14 | $60.40 | 50 | $3,020
Review of Results | Health Scientist Administrator, GS-14 | $60.40 | 150 | $9,060

Hourly rates are derived from annual salaries in the 2018 OPM Pay Schedule for the Washington, DC area: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2018/DCB_h.pdf


15. Changes in Hour Burden

This is a new collection of information.


16. Time Schedule, Publication and Analysis Plans

Once OMB approval is received, pilot testing is scheduled to begin in November 2018. A mixed methods approach will be used to evaluate testing of the modified app, with a focus on implementation processes, general evaluation of app usage, and post-testing feedback from the three stakeholder groups. The timeline of scheduled tasks is provided below:


  1. Conduct Pilot Test for the OBERD app
     a. App Integration, Implementation (November 2018-January 2019)
     b. App Usage (January-June 2019)
     c. Evaluation (June-July 2019)

  2. Report Pilot Test Findings for the OBERD app
     a. Draft Pilot Test Findings Report (August 30, 2019)
     b. Expert Panel Review and Input on Findings (August 2019)
     c. Final Pilot Test Findings Report (September 30, 2019)

  3. Conduct Pilot Test for the challenge-winning app
     a. Orientation to the new app / Adapt pilot plan (February-March 2019)
     b. App Integration, Implementation (March-April 2019)
     c. App Usage (March-August 2019)
     d. Evaluation (August-September 2019)

  4. Report Pilot Test Findings for the challenge-winning app
     a. Draft Pilot Test Findings Report (September-October 2019)
     b. Final Pilot Test Findings Report (September-October 2019)


The quantitative app usage data will be analyzed using appropriate descriptive statistical methods by our biostatistics researcher. The qualitative semi-structured interview data analysis will be led by Dr. Kellogg using a grounded theory approach to identify emerging themes in each response category for each stakeholder group.


17. Exemption for Display of Expiration Date

AHRQ does not seek this exemption.



List of Attachments:


Attachment A: PROMIS Item Bank V2.0


Attachment B: Readiness to Change Assessment


Attachment C: Process Usage Worksheet (Patients)


Attachment D: Process Usage Worksheet (Providers)


Attachment E: Patient Interview Guide


Attachment F: Provider Interview Guide


Attachment G: Health Information Technology Professional Interview Guide


Attachment H: System Usability Scale (SUS)


Attachment I: Training Materials







1 Acquadro C, Berzon R, Dubois D. Incorporating the Patient's Perspective into Drug Development and Communication: An Ad Hoc Task Force Report of the Patient-Reported Outcomes (PRO) Harmonization Group Meeting at the Food and Drug Administration. Value Health. 2003;6(5):522-531.

2 Weldring T, Smith S. Patient-Reported Outcomes (PROs) and Patient-Reported Outcome Measures (PROMs). Health Serv Insights. 2013;6:61-68.

3 Snyder C, Jensen R, Segal J, Wu A. Patient-Reported Outcomes (PROs): Putting the Patient Perspective in Patient-Centered Outcomes Research. Med Care. 2013;51:S73-S79.

4 Turnbull AE, O'Connor CL, Lau B, Halpern SD, Needham DM. Allowing Physicians to Choose the Value of Compensation for Participation in a Web-Based Survey: Randomized Controlled Trial. J Med Internet Res. 2015;17(7):e189. URL: https://www.jmir.org/2015/7/e189. DOI: 10.2196/jmir.3898. PMID: 26223821; PMCID: 4705363.

5 Cook DA, Wittich CM, Daniels WL, West CP, Harris AM, Beebe TJ. Incentive and Reminder Strategies to Improve Response Rate for Internet-Based Physician Surveys: A Randomized Experiment. J Med Internet Res. 2016;18(9):e244. URL: https://www.jmir.org/2016/9/e244. DOI: 10.2196/jmir.6318. PMID: 27637296; PMCID: 5045523.

6 Pit SW, Vo T, Pyakurel S. The effectiveness of recruitment strategies on general practitioner's survey response rates – a systematic review. BMC Med Res Methodol. 2014;14:76. doi:10.1186/1471-2288-14-76.


