
SUPPORTING STATEMENT


Part A







Patient-Centered Outcomes Research Clinical Decision Support: Current State and Future Directions

AHRQ Project No.: HHSP233201500023I





Version: February 25, 2020







Agency for Healthcare Research and Quality (AHRQ)






A. Justification


1. Circumstances that make the collection of information necessary


The mission of the Agency for Healthcare Research and Quality (AHRQ) set out in its authorizing legislation, The Healthcare Research and Quality Act of 1999 (see http://www.ahrq.gov/hrqa99.pdf), is to enhance the quality, appropriateness, and effectiveness of health services, and access to such services, through the establishment of a broad base of scientific research and through the promotion of improvements in clinical and health systems practices, including the prevention of diseases and other health conditions. AHRQ shall promote health care quality improvement by conducting and supporting:


1. research that develops and presents scientific evidence regarding all aspects of health care; and


2. the synthesis and dissemination of available scientific evidence for use by patients, consumers, practitioners, providers, purchasers, policy makers, and educators; and


3. initiatives to advance private and public efforts to improve health care quality.


Also, AHRQ shall conduct and support research and evaluations, and support demonstration projects, with respect to (A) the delivery of health care in inner-city areas, and in rural areas (including frontier areas); and (B) health care for priority populations, which shall include (1) low-income groups, (2) minority groups, (3) women, (4) children, (5) the elderly, and (6) individuals with special health care needs, including individuals with disabilities and individuals who need chronic care or end-of-life health care.


Research has shown that health care quality in the U.S. varies significantly and that only half of adults receive evidence-based, recommended care.1 Individuals with multiple chronic conditions (42% of adults) and older adults are at particular risk for negative health outcomes.2,3 Current evidence shows that clinical decision support (CDS) systems improve adherence to evidence-based practices by analyzing patient data and making appropriate information available to clinicians when they need it. CDS systems are usually electronic health record (EHR)-based and encompass tools such as alerts, clinical guidelines, patient reports and dashboards, diagnostic support, and workflow tools.4 These tools help reduce clinical errors and allow for customization to patient needs, improving quality of care and patient outcomes.5,6,7

The AHRQ PC CDS Learning Network (PC CDS LN) defines patient-centered CDS (PC CDS) as: “CDS that supports individual patients and their approved caregivers and/or care teams in health-related decisions and actions by leveraging information from PCOR findings and/or patient-specific information (e.g., patient-generated health data).”8 Through PC CDS,9,10 AHRQ seeks to accelerate the movement of patient-centered outcomes research (PCOR) evidence into practice and to make CDS more shareable, standards-based, and publicly available. Traditionally, CDS initiatives have focused on provider-directed guidelines and on increasing the shareability of CDS artifacts; PC CDS, by contrast, targets both patients (and/or caregivers) and providers.11

AHRQ’s support for patient-centered CDS has included the PC CDS Learning Network, CDS Connect, and other related grants and contracts. In this project, AHRQ seeks to conduct a comprehensive evaluation to assess the impact of the PCOR CDS Initiative, to understand the current state of PC CDS, and to identify gaps that will guide AHRQ’s future research.


This research has one overarching goal: to assess the accomplishments of and opportunities for the PCOR CDS Initiative as a whole12 and of each of its four components: the PC CDS Learning Network,13 CDS Connect,14 Quantifying Efficiencies,15 and the U18 CDS Resource Grants.


To achieve this goal, the following data collections will be implemented:

  1. Key informant interviews with up to 147 individuals across all Initiative components, conducted either by phone or in person during site visits. Informants will be assigned to one of 14 interview protocols (Attachments D through Q) based on the PCOR CDS Initiative component they worked on and their role on the project. The 14 protocols are:

    1. PC CDS Learning Network – Leader

    2. PC CDS Learning Network – Governance/Non-Executive Steering Committee

    3. PC CDS Learning Network – Contributor

    4. CDS Connect – Leader

    5. CDS Connect – Contributor

    6. CDS Connect – Consumer/Patient

    7. CDS Connect – Participant

    8. Quantifying Efficiencies – Leader

    9. Quantifying Efficiencies – Informaticist

    10. Quantifying Efficiencies – Clinician

    11. PC CDS Projects – Site Leader

    12. PC CDS Projects – Informaticist

    13. PC CDS Projects – Clinician

    14. PC CDS Projects – Patient



  2. A web-based survey (see Attachment R) of approximately 453 “contributors” to and “consumers” of CDS Connect resources.


This study is being conducted by AHRQ through its contractor, NORC at the University of Chicago, pursuant to AHRQ’s statutory authority to conduct and support research on healthcare and on systems for the delivery of such care, including activities with respect to the quality, effectiveness, efficiency, appropriateness and value of healthcare services and with respect to quality measurement and improvement. 42 U.S.C. 299a(a)(1) and (2).


2. Purpose and Use of Information


This mixed methods evaluation seeks to answer the following research questions about the PCOR CDS Initiative (the Initiative) as a whole:

  1. To what extent, and how, has the PCOR CDS Initiative promoted the dissemination and implementation of PCOR findings through sharable, standards-based, and publicly available CDS? How does the Initiative fit into the broad, nationwide, multi-stakeholder process of developing and using CDS to improve health?

  2. How did the activities and products carried out by each component (e.g., webinars, workgroups, in-person meetings, repositories, CDS artifacts and development tools, final reports or plans) contribute to the Initiative's operations and goals, and what factors facilitated or impeded the success of these products/activities?

  3. What do stakeholders perceive to be the impacts of the Initiative to date, including reflection on their own involvement in it, and current or potential achievements, such as the development of a common definition of PC CDS and growth of interest in and capacity for developing these types of CDS among stakeholders?

  4. How does the Initiative address federal policies for the dissemination and implementation of evidence-based research funded by the PCOR Trust Fund, and how does it interact with other federal policy initiatives designed to promote the widespread use of, interoperability of, and patient access to information from EHRs with advanced CDS?

  5. What can AHRQ learn from the CDS Initiative that is relevant to other initiatives aimed at disseminating and implementing clinical evidence and evidence-based practices? How can the lessons learned here inform future research, implementation, and dissemination initiatives?

Information collected by the study will inform strategies to promote the adoption of PCOR evidence into practice through CDS developed by AHRQ and other Department of Health and Human Services agencies, including the Centers for Medicare & Medicaid Services (CMS) and the Office of the National Coordinator for Health IT, as well as by state and local governments and private health care organizations. Findings from the evaluation can help identify and shape strategies to promote more effective implementation of PCOR CDS, accelerating the movement of evidence into clinical practice and supporting patient-centered decision making by clinicians with their patients.

To achieve these goals, the evaluation team will use key informant interviews and a web-based survey to gather information about the programs from stakeholders, contributors, and users of the CDS Initiative programs.


Key Informant Interviews: The evaluation team will conduct semi-structured interviews with people involved in the Initiative’s components, including representatives from academia, industry, health systems, and government. Key informants represent the following general groups:

  • Leaders: Includes AHRQ Project Officers, Contractor’s senior staff, and Senior Consultants to Initiative components. Leaders are expected to have set the direction of the components or activities and to be familiar with the activities, the processes of implementation, and their outputs in their entirety.

  • Contributors: Includes lead authors or content developers for a product or output of a component; contributors may overlap with leaders. Example contributors from the PC CDS LN include lead authors of the Trust Framework, Opioid Action Plan, or Patient Blogs; examples from CDS Connect include individuals who contributed CDS artifacts to the repository.

  • Participants: Includes individuals who participated in workgroups of either the PC CDS LN or CDS Connect, or who served on an advisory committee for the development of one of the products.

  • Consumers: Includes individuals who have used a product developed by the Initiative, particularly artifacts found in the CDS Connect Repository and the CDS Connect Authoring Tool. Individuals will be identified from interviews with leaders, contributors, and participants, and through a literature review to identify authors who reference PCOR CDS Initiative products (i.e., reports or artifacts).

AHRQ and the evaluation contractor will create a list of eligible key informants that reflects the appropriate mix of roles and depth of experience to ensure a comprehensive evaluation. Key informants will receive invitational emails that explain the scope of the interview and allow candidates to ask questions before accepting or declining the invitation. We will include clinical staff in our sample of participants in the Quantifying Efficiencies grant program, the U18 grants, and the two opioid-related CDS projects. Involving staff at clinical sites will also be critical to understanding the value of PC CDS in the context of provider workflows and burdens.

The primary research questions and topics covered in the interviews for each component are shown in Exhibit 1.

Exhibit 1. Primary Research Questions and Interview Topics by Component

PC CDS LN
  Primary research questions: How has the Learning Network generated interest in the development, implementation, and dissemination of PC CDS, and in which aspects?
  Interview topics: Engaging stakeholders; discussion of activities to date and proposed strategic plans; dissemination; sustainability.

CDS Connect
  Primary research questions: What has CDS Connect done to help CDS artifacts become more shareable, interoperable, standards-based, and more widely disseminated?
  Interview topics: Infrastructure development; promotion; usability; accomplishments; challenges with use and areas for improvement; sustainability; lifecycle of CDS; standards.

Quantifying Efficiencies Gained through Sharable CDS Support Resources
  Primary research questions: How have CDS Connect products addressed known barriers and facilitators to CDS incorporation and routine use in care delivery? How easy or hard is it to use CDS available on CDS Connect?
  Interview topics: Perceived clinical needs and value of CDS; implementation experience; measuring impacts; perceived outcomes and impacts.

CDS Development Projects (up to 10): 5 projects funded as of September 30, 2019 (2 U18 grantees, 2 opioid-related CDS projects, and 1 NORC pilot); up to 5 more U18 projects may be awarded between 2020 and 2022.
  Primary research questions: How do CDS Connect resources and tools support implementation of shareable CDS resources? How do new CDS projects meet federal requirements, such as the Medicare and Medicaid Promoting Interoperability Programs, and support interoperability?
  Interview topics: Types of CDS developed and their purpose; stages of development; successes and challenges with development and implementation.


Web Survey. The purpose of the web survey is to understand who the users of CDS Connect resources are, why they use the resources, how they use them, and how they perceive their value. The CDS Connect resources of interest include the CDS Authoring Tool, artifacts in the CDS Connect Repository, and open-source CDS Connect resources available on GitHub, a platform for developing and sharing software. Respondents will be identified through a chain-referral methodology. The first set of survey invitations will be sent to a list of email addresses of known contributors to or users of CDS Connect, as well as a group of potential users of CDS Connect. At the end of the survey, each respondent will be asked to provide names and email addresses for up to four other users of CDS Connect resources. After the list of names from all referrals is deduplicated, a survey invitation will be sent to these referrals.
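To make the referral step concrete, the sketch below shows one minimal, illustrative way (in Python, with hypothetical function names and example addresses) that a referral list could be deduplicated against addresses that have already been invited before a new wave of invitations is sent; it is not part of the survey instruments or the contractor’s actual tooling.

# Illustrative sketch only: one way to deduplicate chain-referral survey
# invitations. Function and variable names are hypothetical and do not
# reflect the contractor's actual systems.

def dedupe_referrals(referrals, already_invited):
    """Return referrals whose email address has not yet been invited.

    referrals: iterable of (name, email) pairs collected at the end of
        completed surveys.
    already_invited: collection of email addresses from the seed list
        and any earlier referral waves.
    """
    seen = {e.strip().lower() for e in already_invited}
    unique = []
    for name, email in referrals:
        key = email.strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append((name, email))
    return unique

# Example (hypothetical addresses): the duplicate and already-invited
# entries are dropped before the next wave of invitations is sent.
seed_list = {"alice@example.org"}
wave_1 = [("Alice A.", "alice@example.org"),
          ("Bob B.", "Bob.B@example.org"),
          ("Bob B.", "bob.b@example.org")]
print(dedupe_referrals(wave_1, seed_list))  # [('Bob B.', 'Bob.B@example.org')]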

The survey instrument includes multiple-choice questions that capture important data points about the use of CDS Connect resources, specifically the CDS Authoring Tool, GitHub resources, and artifacts from the CDS Connect Repository. Respondents will be presented with more detailed questions about CDS Connect resource usage only when their responses to initial screening questions indicate those questions apply. Based on in-house testing, the survey will take ten minutes on average to complete. Question domains for the survey instrument are shown in Exhibit 2 below. The draft survey can be found in Attachment R.

Exhibit 2. CDS Connect Resources User Survey Question Domains

Respondent Background: Questions about the respondent, including their professional role, years of experience, organizational affiliation (optional), and participation in other collaborative CDS initiatives.

Awareness of CDS Connect Resources and the PCOR CDS Initiative: Questions on whether non-users had heard about the Repository and Authoring Tool before taking the survey, and whether all respondents had heard about the PCOR CDS Initiative and, if so, how. Responses may also serve as a proxy for how respondents heard about CDS Connect.

Frequency of Use of CDS Connect Resources: Questions assessing how often users visited the Repository or used the Authoring Tool to develop CQL logic.

Reasons for Using CDS Connect Resources: Questions assessing why and how respondents used the resources, including whether they contributed, browsed, inspected, or adapted artifacts, and their purpose for using the Authoring Tool.

Reasons for Not Using CDS Connect Resources: Questions assessing awareness of CDS Connect resources among non-users and their reasons for not using the resources.

Issues with Using CDS Connect Resources and Technical Assistance: Questions about issues encountered when using CDS Connect resources and about use of technical assistance.

Satisfaction with CDS Connect Resources: Questions exploring users’ satisfaction with CDS Connect resources, their impressions of the resources’ value, and whether they would recommend them to others.

Intended or Actual Outcomes of Using CDS Connect Resources: Questions assessing whether CDS knowledge developed through the use of CDS Connect resources (i.e., artifacts downloaded from the Repository or CQL logic created in the Authoring Tool) was ultimately adapted as CDS tools for use in a local environment, or is planned for such use. There are also questions about challenges encountered in adapting CDS artifacts from the Repository for local use, and about whether the CDS is intended to be patient-facing, clinician-facing, or both.



The findings from all components of the evaluation will be widely disseminated to federal, state and local policy makers, as well as private sector health care decision makers, via AHRQ’s National Resource Center for Health IT Website, the Office of the National Coordinator for Health IT Website, professional societies, peer-reviewed publications, e-mail alerts, and conference presentations.

3. Use of Improved Information Technology

Key Informant Interviews. The key informant interviews will be semi-structured and conducted in person or by telephone with study respondents. Because most interview questions are open-ended to allow in-depth exploration of issues, electronic submission of responses is not a viable option.

Web Survey. To minimize respondent burden and permit the electronic submission of responses, the survey of CDS Connect users will be web-based and deployed using a well-designed, low-burden, respondent-friendly administration process and instrument. The survey will include programmed skip logic so that respondents see only the questions relevant to their answers to initial screening questions.
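As an illustration only, the following short Python sketch shows the kind of branching the programmed skip logic performs; the question identifiers and block names are hypothetical and do not correspond to the actual items in Attachment R.

# Minimal sketch of the survey's programmed skip logic, assuming screening
# answers determine which detailed question blocks are shown. Question IDs
# and block names are hypothetical; the actual instrument is in Attachment R.

SCREENER_TO_BLOCK = {
    "used_authoring_tool": "authoring_tool_detail",
    "used_repository": "repository_detail",
    "used_github_resources": "github_detail",
}

def blocks_to_show(screening_answers):
    """Return the detailed question blocks relevant to this respondent.

    screening_answers maps screener question IDs to True/False. Non-users
    are routed to a 'reasons for not using' block instead.
    """
    shown = [block for screener, block in SCREENER_TO_BLOCK.items()
             if screening_answers.get(screener)]
    return shown or ["reasons_for_not_using"]

print(blocks_to_show({"used_repository": True}))  # ['repository_detail']
print(blocks_to_show({}))                         # ['reasons_for_not_using']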

4. Efforts to Identify Duplication

This is the first evaluation of the projects in the PCOR CDS Initiative. The individual projects may have generated reports summarizing results and lessons learned, which will be used as a source of material for this evaluation. However, there has not been a previous effort to examine the results across all of the separate projects.

5. Involvement of Small Entities

The information collected may involve small entities, as some of the participating health care organizations may be small practices or systems. Only items that provide critical information for conducting the evaluation will be included, and the information requested has been held to the absolute minimum required for the intended use.

6. Consequences if Information Collected Less Frequently

This is a one-time collection.

7. Special Circumstances

This request is consistent with the general information collection guidelines of 5 CFR 1320.5(d)(2). No special circumstances apply.

8. Federal Register Notice and Outside Consultations


8.a. Federal Register Notice

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), AHRQ published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on March 25, 2020, Volume 85, Number 58, page 16943, and provided a sixty-day period for public comment. A copy of this notice is attached as Attachment T. During the notice and comment period, the government received no requests for information or substantive comments.

8.b. Outside Consultations

AHRQ and its evaluation contractor, NORC at the University of Chicago, are consulting a technical expert panel (TEP) for guidance on the plan and design of this project. The TEP comprises CDS experts best able to advise on the optimal approach to conducting the evaluation, the Horizon Scan, the pilot project, and the dissemination of project findings. (See Attachment A for the list of TEP members.) The first TEP meeting was held in person on February 3, 2020. During this meeting, NORC gathered feedback on the overall approach to data collection and analysis. The second in-person TEP meeting is tentatively scheduled for September 2020.


In addition, NORC is consulting with the following experts in CDS implementation to design the evaluation:

  • Dr. Dean Sittig, Professor of Biomedical Informatics and Bioengineering; University of Texas Health Science Center at Houston

  • Dr. Aziz Boxwala, Co-Founder and President of Elimu Informatics.

9. Payments/Gifts to Respondents

There will be no remuneration to respondents.

10. Assurance of Confidentiality

For both the key informant interviews and the web survey, individuals and organizations will be assured of the confidentiality of their replies under Section 934(c) of the Public Health Service Act, 42 USC 299c-3(c). They will be told the purposes for which the information is collected and that, in accordance with this statute, any identifiable information about them will not be used or disclosed for any other purpose without their prior consent.


Key informant interviews. During the key informant interviews and site visits, NORC will collect the respondent’s name, phone number, organizational affiliation, and title for case-tracking purposes or clarification callbacks. NORC will not collect any other information about the respondent or any other individual in the establishment beyond their role on the project. All electronic files will be password protected and accessible only from a secured network. When not in use by project staff, all printed information or materials that could potentially identify study participants will be stored in locked cabinets accessible only to project team members.


All respondent involvement will be voluntary. Oral consent for participation will be obtained from respondents. Respondents will be informed that: (1) copies of the interview notes will not be shared with anyone outside of the team; (2) respondent comments may be included in reports and publications but will not be attributed to specific individuals or organizations; and (3) the interviewers have a system to mark specific comments in interview notes as off-limits for reports and publications when notified to do so by the respondent.


Web survey. Web survey respondents will be informed that their participation is voluntary. The survey will include a statement of confidentiality containing the following statement:

The confidentiality of your responses is protected by Sections 934(c) and 308(d) of the Public Health Service Act [42 U.S.C. 299c-3(c) and 42 U.S.C. 242m(d)]. Information that could identify you will not be disclosed unless you have consented to that disclosure.

The survey will collect the respondent’s name, title, and organizational affiliation for purposes of understanding their role in using CDS. It will also collect the names and email addresses of other users of CDS resources known to the respondent (referrals). Initial respondents will be asked for permission to share their name in the survey invitations sent to the individuals they refer. Their name will not be linked to the data for any other purpose.

All data will be collected by AHRQ’s contractor, NORC. All facility and respondent-level data, as well as survey response data, will be stored on NORC’s secure servers.

11. Questions of a Sensitive Nature

No questions of a sensitive nature will be asked. During the introduction to the interview, respondents will be informed that their participation is voluntary and that they can refuse to answer any question. The web survey will be programmed so that any question that does not apply to the respondent can be skipped.


12. Estimates of Annualized Burden Hours and Costs


Key Informant Interviews. Key informant interviews will be conducted with up to 147 key informants across a variety of organizations involved in each component of the Initiative. NORC will use one of 14 interview protocols based on the component the key informant is involved in and their role in that component. As shown in Exhibit 3, the interview form names indicate the key informant’s role on the project. Attachment B includes the interview recruitment materials and a thank-you e-mail template, and Attachment C includes the information sheets for all informant types. Attachments D through Q include the interview protocols for each informant type, as noted in Exhibit 3. All interviews are expected to last one hour. Some key informants may serve multiple roles or work on multiple projects; in these cases, the relevant protocols will be combined and streamlined so that the informant completes only one interview. Some key informant interviews for the sites or the opioid-related grants may be conducted during site visits at the implementation sites, either with individuals or with small groups of respondents.


Web Survey. For the web survey, it is estimated that 453 CDS Connect users will respond to the 10-minute survey. (See Attachment R for the survey instrument and Attachment S for the survey invitations.)


The total annual burden hours for the key informant interviews and surveys is estimated to be 224 hours, as shown in Exhibit 3.

Exhibit 3. Estimated Annualized Burden Hours

Form Name | Number of respondents | Hours per response | Total burden hours
Attachment D: PC CDS Learning Network – Leader | 7 | 1 | 7
Attachment E: PC CDS Learning Network – Steering Committee | 3 | 1 | 3
Attachment F: PC CDS Learning Network – Contributor | 8 | 1 | 8
Attachment G: CDS Connect – Leader | 5 | 1 | 5
Attachment H: CDS Connect – Contributor | 20 | 1 | 20
Attachment I: CDS Connect – Consumer | 25 | 1 | 25
Attachment J: CDS Connect – Participant | 10 | 1 | 10
Attachment K: Quantifying Efficiencies – Leader | 5 | 1 | 5
Attachment L: Quantifying Efficiencies – Informaticist | 4 | 1 | 4
Attachment M: Quantifying Efficiencies – Clinician | 8 | 1 | 8
Attachment N: PC CDS Projects – Site Leader | 18 | 1 | 18
Attachment O: PC CDS Projects – Informaticist | 10 | 1 | 10
Attachment P: PC CDS Projects – Clinician | 20 | 1 | 20
Attachment Q: PC CDS Projects – Patient | 4 | 1 | 4
Attachment R: Web Survey of CDS Connect Users | 453 | 0.17 | 77
Total | 600 | | 224


Exhibit 4 shows the estimated annual cost burden associated with the respondents' time to participate in this information collection, which comes to $14,371.86.


Exhibit 4. Estimated Annualized Cost Burden

Form name | Number of interviews | Total burden hours | Average hourly wage rate** | Total cost burden
Attachment D: PC CDS Learning Network – Leader | 7 | 7 | $59.54¹ | $416.78
Attachment E: PC CDS Learning Network – Steering Committee | 3 | 3 | $59.54¹ | $178.62
Attachment F: PC CDS Learning Network – Contributor | 8 | 8 | $59.54¹ | $476.33
Attachment G: CDS Connect – Leader | 5 | 5 | $59.54¹ | $297.71
Attachment H: CDS Connect – Contributor | 20 | 20 | $59.54¹ | $1,190.82
Attachment I: CDS Connect – Consumer | 25 | 25 | $59.54¹ | $1,488.53
Attachment J: CDS Connect – Participant | 10 | 10 | $59.54¹ | $595.41
Attachment K: Quantifying Efficiencies – Leader | 5 | 5 | $59.54¹ | $297.71
Attachment L: Quantifying Efficiencies – Informaticist | 4 | 4 | $59.54¹ | $238.16
Attachment M: Quantifying Efficiencies – Clinician | 8 | 8 | $101.43² | $811.46
Attachment N: PC CDS Projects – Site Leader | 18 | 18 | $59.54¹ | $1,071.74
Attachment O: PC CDS Projects – Informaticist | 10 | 10 | $59.54¹ | $595.40
Attachment P: PC CDS Projects – Clinician | 20 | 20 | $101.43² | $2,028.60
Attachment Q: PC CDS Projects – Patient | 4 | 4 | $24.98³ | $99.93
Attachment R: Web Survey of CDS Connect Users | 453 | 77 | $59.54¹ | $4,584.66
Total | 600 | 224 | | $14,371.86


**Wage rates were calculated using the mean hourly wage from the U.S. Department of Labor, Bureau of Labor Statistics, May 2018 National Occupational Employment and Wage Estimates for the United States, https://www.bls.gov/oes/current/oes_nat.htm

¹ Average rate for Computer and Information Research Scientists

² Average rate for Physicians and Surgeons

³ Average rate for All Occupations
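As a cross-check on Exhibits 3 and 4, the burden and cost figures reduce to simple multiplications: respondents × hours per response, and burden hours × the applicable mean hourly wage. The short Python sketch below is illustrative only and uses the rounded rates shown in the exhibits, so results may differ from the exhibit figures by a few cents.

# Illustrative arithmetic behind Exhibits 3 and 4, using the rounded rates
# shown in the exhibits; results may differ slightly because of rounding.

interview_respondents = 147        # Attachments D through Q, 1 hour each
survey_respondents = 453           # Attachment R
survey_hours_per_response = 0.17   # 10-minute survey, as in Exhibit 3

interview_hours = interview_respondents * 1                            # 147 burden hours
survey_hours = round(survey_respondents * survey_hours_per_response)   # 453 x 0.17 -> 77
total_burden_hours = interview_hours + survey_hours                    # 224 (Exhibit 3 total)

# Example cost row (Attachment R): burden hours x mean hourly wage for
# Computer and Information Research Scientists.
survey_cost_burden = survey_hours * 59.54                              # about $4,585
print(total_burden_hours, round(survey_cost_burden, 2))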


13. Estimates of Annualized Respondent Capital and Maintenance Costs

There are no direct costs to respondents other than their time to participate in the study.

14. Estimates of Total and Annualized Cost to the Government


The estimated total cost to the Federal Government for this project is $1,278,193 over a three-year period from September 30, 2019 to September 29, 2022. The estimated average annual cost is $426,064. Exhibit 5a provides a breakdown of the estimated total and average annual costs by category.

Exhibit 5a. Estimated Total and Annualized Cost

Cost Component | Total Cost | Annualized Cost
Project Development | $424,715 | $141,572
Data Collection Activities | $510,340 | $170,113
Data Processing and Analysis | $238,698 | $79,566
Project Management | $104,440 | $34,813
Total | $1,278,193 | $426,064



The estimated annual cost for AHRQ oversight of the evaluation is shown in Exhibit 5b.

Exhibit 5b. Federal Government Personnel Cost

Activity | Federal Personnel | Annual Salary | % of Time | Cost
Management Support: GS-12, Step 2 average | 1 | $89,213 | 10% | $8,921.80
Subject Matter Expert support: GS-15, Step X average | 3 | $161,730 | 2.0% | $9,703.80
Total | | | | $18,625.60

Annual salaries are based on the 2020 OPM Pay Schedule for the Washington, DC area: https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2020/DCB.pdf

15. Changes in Hour Burden

This is a new collection of information.

16. Time Schedule, Publication and Analysis Plans

Time schedule and publication plans. The anticipated schedule for this project is shown in Exhibit 6. Once clearance from the Office of Management and Budget is obtained, AHRQ will begin identifying appropriate respondents and scheduling and conducting interviews.


Exhibit 6. Anticipated Schedule


Activity | Length | Estimated timeline following OMB clearance

Conduct and Analyze Key Informant Interviews
  CDS Learning Network | 6 weeks | Months 1-2
  CDS Connect | 9 weeks | Months 2-3
  Quantifying Efficiencies Gained | 5 weeks | Months 3-4
  U18 Grants (up to 7) | 9 weeks | Months 3-4
  Qualitative data analysis of key informant interviews | 15 weeks | Months 2-5

Conduct and Analyze Site Visits
  Conduct site visits for MedStar | 1 week | Month 2
  Conduct up to seven site visits for U18 projects | 12 weeks | Months 10-12
  Qualitative analysis of site visit data | 4 weeks | Month 13

Conduct and Analyze CDS Connect Web Survey
  Preparation for data collection | 15 weeks | Months 1-4
  Data collection | 12 weeks | Months 5-7
  Quantitative analysis of survey data | 10 weeks | Months 8-10

Year 1 Evaluation Report | 20 weeks | Months 10-14
Year 2 Evaluation Report | 20 weeks | Months 23-27
Final Evaluation Report | 24 weeks | Months 31-35



Analysis plans.

Key informant interviews. The evaluation team will systematically code transcripts of interviews to identify factors that contribute to the successes or challenges of the PCOR CDS Initiative as a whole and of its individual components.


The team will develop a codebook of existing themes (from the interview guide) and emergent themes (from responses) and will code interview notes and transcripts in NVivo software. Codes will also correspond to concepts identified in the peer-reviewed and grey literature, including domains from the Analytic Framework for Action developed by the PC CDS LN. To ensure high-quality analysis, team members will flag coding ambiguities and develop new codes as needed.


Coded data will be used to develop narratives that answer the research questions for particular components and for the Initiative as a whole. Analysis of findings within codes will reveal similarities and differences in the perspectives of key informants, as well as the range of opinions and experiences on a given topic. Analysis of relationships between or among codes will examine how themes and concepts interrelate. The quantitative analysis will include descriptive statistics for the program measures.


Web survey. The evaluation team will use univariate descriptive statistics to analyze the CDS Connect user survey results. For example, we will calculate the percentage of respondents who used each resource (e.g., the Authoring Tool, the artifact Repository, and GitHub resources), the extent of their use of these resources, their perceptions of the resources’ value, and the barriers to using them. We will calculate top-box responses for questions about satisfaction, likelihood to recommend the resources, and similar items. We will compare the characteristics of seed respondents and referral respondents to better understand the composition and diversity of the sample, and we will assess differences between the two groups’ responses to the survey questions. In addition, the evaluation will include cross-tabulations and bivariate analyses comparing respondent characteristics (organizational type, role, years of experience in CDS development, and participation in CDS Connect initiative activities) with the extent of reported use of each CDS Connect resource.
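The descriptive analyses described above are straightforward to implement; the sketch below (Python with pandas, using hypothetical variable names and example rows invented purely for illustration) shows how a resource-use percentage, a top-box satisfaction score, and a seed-versus-referral cross-tabulation could be computed. It is a sketch under those assumptions, not the evaluation team’s actual analysis code.

# Illustrative sketch of the planned descriptive analyses using pandas.
# Column names and example rows are hypothetical, for illustration only.
import pandas as pd

df = pd.DataFrame({
    "respondent_type": ["seed", "seed", "referral", "referral", "referral"],
    "used_repository": [True, True, False, True, True],
    "satisfaction":    [5, 4, 3, 5, 2],   # 1 (low) to 5 (high)
})

# Percentage of respondents who used the Repository (univariate descriptive).
pct_repository_users = 100 * df["used_repository"].mean()

# Top-box score: share of respondents giving the highest satisfaction rating.
top_box_pct = 100 * (df["satisfaction"] == 5).mean()

# Seed vs. referral comparison: cross-tabulation of Repository use (row %).
use_by_type = pd.crosstab(df["respondent_type"], df["used_repository"],
                          normalize="index") * 100

print(round(pct_repository_users, 1), round(top_box_pct, 1))
print(use_by_type)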


17. Exemption for Display of Expiration Date

AHRQ does not seek this exemption.

List of Attachments:

Attachment A: Technical Expert Panel Members
Attachment B: Key Informant Interview Recruitment Materials
Attachment C: Key Informant Interview Information Sheet
Attachment D: PC CDS Learning Network – Leader
Attachment E: PC CDS Learning Network – Steering Committee
Attachment F: PC CDS Learning Network – Contributor
Attachment G: CDS Connect – Leader
Attachment H: CDS Connect – Contributor
Attachment I: CDS Connect – Consumer
Attachment J: CDS Connect – Participant
Attachment K: Quantifying Efficiencies – Leader
Attachment L: Quantifying Efficiencies – Informaticist
Attachment M: Quantifying Efficiencies – Clinician
Attachment N: PC CDS Projects – Site Leader
Attachment O: PC CDS Projects – Informaticist
Attachment P: PC CDS Projects – Clinician
Attachment Q: PC CDS Projects – Patient
Attachment R: Web Survey of CDS Connect Users
Attachment S: CDS Connect Web Survey Recruitment Invitations and Follow-up
Attachment T: Federal Register Notice


References

1 McGlynn, E. A., Asch, S. M., Adams, J., Keesey, J., Hicks, J., DeCristofaro, A., & Kerr, E. A. (2003). The quality of health care delivered to adults in the United States. New England Journal of Medicine, 348(26), 2635-2645.

2 Buttorff, C., Ruder, T., & Bauman, M. (2017). Multiple chronic conditions in the United States. Santa Monica, CA: RAND Corporation.

3 Melnyk, B. (2016). Widespread adoption of evidence-based practices is essential for a growing Medicare population. The American Journal of Managed Care, 22(7 Spec No.), SP262-SP263.

4 Osheroff, J. A., Teich, J. M., Middleton, B., Steen, E. B., Wright, A., & Detmer, D. E. (2007). A roadmap for national action on clinical decision support. Journal of the American Medical Informatics Association, 14(2), 141-145.

5 Garg, A. X., Adhikari, N. K., McDonald, H., Rosas-Arellano, M. P., Devereaux, P. J., Beyene, J., ... & Haynes, R. B. (2005). Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA, 293(10), 1223-1238.

6 Wolfstadt, J. I., Gurwitz, J. H., Field, T. S., Lee, M., Kalkar, S., Wu, W., & Rochon, P. A. (2008). The effect of computerized physician order entry with clinical decision support on the rates of adverse drug events: a systematic review. Journal of General Internal Medicine, 23(4), 451-458.

7 Musen, M. A., Middleton, B., & Greenes, R. A. (2014). Clinical decision-support systems. In Biomedical Informatics (pp. 643-674). Springer, London.

8 Patient-Centered Clinical Decision Support Learning Network. Retrieved from https://pccds-ln.org/

10 Implementation and Evaluation of New Health Information Technology (IT) Strategies for Collecting and Using Patient-Reported Outcome (PRO) Measures (U18). (2017). Retrieved from https://grants.nih.gov/grants/guide/pa-files/PA-17-247.html

11 Marcial, L. H., Richardson, J. E., Lasater, B., Middleton, B., Osheroff, J. A., Kawamoto, K., & Blumenfeld, B. H. (2018). The imperative for patient-centered clinical decision support. eGEMs, 6(1).

