Interview Protocol: Implementation Experience

Implementation and Testing of Diagnostic Safety Resources

Attachment J - Implementation Interviews Protocol




Form Approved
OMB No. 0935-XXXX
Exp. Date XX/XX/20XX





Interview Protocol: Implementation Experience







Overview: This Interview Protocol establishes a master list of interview questions that can be used across qualitative data collection protocols for each of the three tools. Not all questions will need to be asked in all interviews, and not all probes will be needed during the interviews. Probes are designed to obtain complete and appropriate information for each of the tools and include generic prompts, such as “can you provide an example?”, “what are other examples to describe this?”, and “tell me a little bit more about that”, as well as more specific prompts tailored to the tool.

Time: Interview sessions will last 30-60 minutes based on respondent(s) availability, and the protocol will be cut as needed to reflect the available time.

Respondent(s): Tool implementers, site leads, and site champions as appropriate by tool and site. These may be clinicians, healthcare organization administrators/leaders, or other healthcare professionals. Depending on the tool and how it is implemented, we may conduct one-on-one interviews, but for efficiency we may also use this protocol to conduct small group interviews with multiple individuals involved in implementation.

  • Measure Dx

  • Calibrate Dx

  • Toolkit for Engaging Patients


This survey is authorized under 42 U.S.C. 299a. This information collection is voluntary and the confidentiality of your responses to this survey is protected by Sections 944(c) and 308(d) of the Public Health Service Act [42 U.S.C. 299c-3(c) and 42 U.S.C. 242m(d)]. Information that could identify you will not be disclosed unless you have consented to that disclosure. Public reporting burden for this collection of information is estimated to average 60 minutes per response, the estimated time required to complete the survey. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The data you provide will help AHRQ’s mission to produce evidence to make health care safer, higher quality, more accessible, equitable, and affordable. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to: AHRQ Reports Clearance Officer Attention: PRA, Paperwork Reduction Project (0935-xxxx) AHRQ, 5600 Fishers Lane, Room #07W42, Rockville, MD 20857, or by email to the AHRQ MEPS Project Director at [email protected].





Introduction and consent:

Thank you for agreeing to participate in today’s interview. Your participation is very important to us. I’m [name] from the RAND Corporation and I’m joined by our notetaker, [name]. We appreciate the opportunity to understand your experience with [tool name].

This survey is authorized under 42 U.S.C. 299a. Your answers are voluntary, and the interview is expected to take about 60 minutes to complete. It has been approved for use under OMB Number 0935-XXXX. We could not conduct this survey without that authorization. We will protect your privacy to the extent allowed by law. [IF RESPONDENT ASKS ABOUT PRA, READ PRA STATEMENT].

Before we begin, I want to give some information about the interview.

  • The interview will take 60 minutes or less.

  • Your participation in this interview is completely voluntary.

  • You can stop the interview at any time.

  • If there is a question you don’t want to answer, just tell me and we’ll move on to the next one.

  • We will not link anything you say here to your name or other identifiable information.

  • I am going to audio record our conversation to help me remember what you say and to support our notetaking. I’ll destroy the recording once we finalize our notes.


Do you have any questions about this project or interview?

Do you agree to take part in this interview?

Do you agree to let us record the interview? IF YES: Ok great. Let me go ahead and start our recording.


INSTRUCTIONS IF GROUP INTERVIEW:

As you enter the meeting, can you put into the chat: a) how long you have been at this site, and b) your role(s) in implementing [tool] at your site?

Overall Experience and Use of Tool

To start, we would like to hear from you generally about how [tool] has been implemented at your site.

  1. Can you tell us a little bit about how [tool] has been implemented at your site?

    1. Probe for specific examples

[SKIP Q2 for Measure Dx tool]

  2. How consistently was the tool used?

    1. Can you give me some examples of how the tool was used consistently (or not)?



  3. What types of changes or adaptations to your existing processes or practices did you need to make to implement [tool] at your site?

    1. [For Measure Dx tool] Probe: What helped in implementing the chosen measurement strategy?

    2. [For Measure Dx tool] Probe: What hindered the review and use of diagnostic safety event data?





  4. Let’s talk about the training that was part of implementing [tool] at your health system. What would have been helpful to have as part of the training for using the [tool]?

    1. For those who attended, what would you improve about the training?

    2. Probe for modality of training, in addition to content

    3. [For Measure Dx tool] Probe: What from the training was shared with the full team?

Barriers and Facilitators to Implementation

Next, we would like to hear more specifically about what challenges you encountered during implementation and what you found to be helpful in implementing [tool].

  5. What has been challenging about implementing [tool]?

    1. [For all Tools]: Probe for general challenges: lack of time, lack of leadership support, additional resources needed, hard to change culture.

    2. [For Measure Dx tool] What has been challenging about implementing the measurement strategy? What about data measurement?

    3. [For Measure Dx tool] Probe: [Tool] has multiple steps, the first of which is implementing a measurement strategy for diagnostic safety events. What are the key elements laid out in [tool]’s processes that you were able to implement? Which ones were not feasible to implement?

    4. [For Measure Dx tool] Probe: What challenges have you encountered using or reviewing the data? Data transparency? Data reporting? Or learning from the data?



  6. Did you implement any modifications to [tool]? If so, what were the reasons or concerns?



  7. What has been helpful in implementing [tool]? What strategies have you found to be helpful?

    1. [For Measure Dx tool] How helpful were the [tool]’s materials in setting up the new measurement strategy? How did you use the information?

    2. [For Measure Dx tool] What would you say were the main factors that facilitated reviewing diagnostic safety data and identifying learnings from that data?

    3. [For Measure Dx tool] Did the [tool]’s materials influence your diagnostic safety team in how you gathered insights from the data or how you made decisions about what processes and practices to improve?



  8. For what types of situations has this tool been easier to use? In other words, are there instances (specific types of visits or types of patients) for which it is easier to implement this approach than others?

    1. Probe for examples of different situations when this tool may be easier (or more challenging) to use.



Utility of Tool



  9. How would you describe the usability of the [tool]’s materials, such as readability, clarity, etc.?

    1. What supports or supplements does it require?

    2. What would have helped in its implementation?


  10. What would you say has been the biggest finding or insight you or your team has gained from implementing [tool] at your site?

    1. [For Measure Dx tool] Probe: What have been the main insights your team has found from the data collection strategy your site chose for measuring diagnostic safety events?

    2. [For Measure Dx tool] Probe: What data did your team use specifically to arrive at these insights? See the data sources listed in Measure Dx for reference (Appendix A).

    3. [For Measure Dx tool] Probe: Were the case examples in Measure Dx used at all? If so, how?

    4. [For Measure Dx tool] Probe: What review tools did you use to identify these insights? See the review tools listed in Measure Dx for reference (Appendix A).



  11. In what situations have you learned something new, or something you would not otherwise have known, from using this [tool] or approach?

    1. Probe for specific examples, situations



  12. [For Measure Dx ONLY] What did you learn that you cannot act on? What made this infeasible? What would you have needed to push this forward?



  13. What has been unexpected or surprising, if anything, to you about using this [tool]?

    1. Probe on cost, engagement of clinicians, reluctance to implement, legal concerns.

    2. [For Measure Dx tool] Probe: What issues or areas for improvement in diagnostic safety were surprising? Were you surprised by the value of any of the specific data in detecting diagnostic errors?

    3. [For Toolkit for Engaging Patients] Probe: What did you learn from patients, or about the visit with patients, that surprised you when using the tool?



  14. What, if any, negative consequences have you noticed in using this [tool]?

    1. Probe for specific examples such as unintended consequences, legal issues



  15. [For Calibrate Dx Only] What legal implications, if any, have arisen as a result of using Calibrate Dx?



Acceptance



We would now like to hear a little about how the [tool] was accepted at your institution.

  16. How widely was the [tool] accepted by those using the tool? Across leaders at your institution? Across the QI team at your institution?

    1. Probe for differences in levels of acceptance across individuals. Probe for any engagement strategies utilized.



  17. What are some of the primary reasons you feel the [tool] was (or was not) accepted at your institution?

    1. Probe for specific examples

Sustainability and Maintenance

  18. How likely is it that you will continue to use this [tool] moving forward?

    1. What might need to change, if anything, to continue using this [tool] in the future?



  19. In thinking about the future, what have you considered doing to ensure sustainment of this tool in practice?

    1. Probe for sustainment of any tool-specific strategies, activities, and teams implemented during the testing period.

    2. Probe for sustainment of staff/team and stakeholder engagement activities

    3. Probe for what would be helpful in sustaining the use of [tool].

    4. Probe for issues related to training, implementation, resources

Closing

  20. Is there anything else related to [tool] that you would like to share that we did not discuss today?

Thank you for your time in speaking with us today.





Appendix A: List of Data and Review Tools Listed in Measure Dx



This is the list of data sources listed in Measure Dx, for reference:


  1. Case referrals to risk management by one or more sources

    • Clinicians and staff

    • Patient experience/patient advocacy departments

    • Legal/compliance and regulatory/accreditation teams

    • Patients or families

  2. Morbidity and mortality conferences

    • May be a source of cases but may also be an output/action in response to a case review

  3. Serious safety events and incident reports

    • Safety event/root cause analysis reports

    • Risk management (at some organizations, events may be called “claims” even if not identified in litigation)

  4. Resolved malpractice claims

    • Risk management

    • Aggregate data from insurers

  5. Hospital acquired conditions data

    • Records of preceding care (evaluate to detect delayed/missed diagnosis)

  6. Autopsy cases

    • Underused data sources that reveal useful patterns of diagnostic discrepancies

  7. Institution- or clinic-wide QI/safety initiatives

    • Mortality reviews (often completed independent of/prior to autopsy)

    • Diagnosis specific (e.g., sepsis, cancer)

    • Reviews of unexpected admissions, transfers to intensive care, codes, rapid response

    • Department-specific review processes such as ED or primary care case review (e.g., unexpected return to ED)

    • Radiology discrepancies and internal lab QI/safety reviews

  8. Peer-review data

    • Formal or informal

  9. Ongoing or focused professional practice evaluation


This is the list of review tools listed in Measure Dx, for reference:

  1. Revised Safer Dx Instrument

  2. Diagnostic Error Evaluation and Research (DEER) taxonomy

  3. Modified fishbone diagram for diagnostic errors

  4. Common Formats for Event Reporting – Diagnostic Safety (CFER-DS)


