2014 Stakeholder Interview Guide

Evaluation of the Children's Health Insurance Program Reauthorization Act of 2009 (CHIPRA) Quality Demonstration Grant Program

Attachment H - 2014 Stakeholder Interview Guide

Stakeholder Interviews

OMB: 0935-0190







Attachment H

2014 Stakeholder Interview Guide


Form Approved

OMB No: 0935-0190

Exp. Date






CHIPRA QUALITY DEMONSTRATION EVALUATION

2014 INTERVIEW GUIDE


EXTERNAL STAKEHOLDER




Thank you for speaking with us today. In the email we sent confirming this interview, we provided information on who we are, why we’re here, and what topics we’re interested in discussing, and we assured you that your responses will be kept confidential. Do you have any questions before we start the interview? If not, may we begin recording the conversation?



If the respondent did not receive or does not remember the confirmation email, or has questions about the information provided in the email, review the introduction to the study on the next page.


Public reporting burden for this collection of information is estimated to average 60 minutes per response, the estimated time to complete the interview. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. Send comments regarding this burden estimate or any other aspects of this collection of information, including suggestions for reducing this burden, to: AHRQ Reports Clearance Officer Attn: PRA, Paperwork Reduction Project (X) AHRQ, 540 Gaither Rd., Rm. 5036, Rockville MD 20850.


Introduction to Study


We are from the Urban Institute and Mathematica Policy Research, and we are part of the team conducting the cross-state evaluation of the CHIPRA Quality Demonstration. The evaluation is federally funded by the Agency for Healthcare Research and Quality (AHRQ). The Centers for Medicare & Medicaid Services (CMS) provides the grants to states.


We are now conducting our second round of visits to the 18 demonstration states. We are meeting with people who are closely involved in the design, management, and day-to-day operations of each state’s demonstration initiatives, as well as other people who care about how the demonstrations affect children’s care quality in Medicaid and CHIP.


We are particularly interested in your thoughts and insights on several topics, including:


  1. The demonstration’s goals and activities and how those activities are interacting with other initiatives in the state


  2. Evidence that the demonstration may be having the desired effect


  3. Your involvement with the demonstration’s activities


  4. The sustainability and spread of demonstration activities


  5. Major changes in the state in the last year that may have impacted the demonstration


Our interview will take an hour. Your responses will be kept confidential to the extent permitted by law, including AHRQ’s confidentiality statute, 42 USC 299c-3(c). Only evaluation team members will have access to your responses. We will report to AHRQ and CMS on the general themes that emerge from all of our discussions. Some reports may list the people we spoke to in a state, but we will not attribute specific comments or quotes to named individuals without permission. We would like to record our discussion in case we miss something in our notes and want to go back and listen. But, we do not plan to transcribe the recording.


Do you have any questions before we start? May I begin recording?



I. Introduction/Background


My first questions are for background.


  1. Please tell me what your position is here at [this organization] and how long you have worked here.

  2. What are your responsibilities in connection with the state’s CHIPRA Quality Demonstration grant? Are you directly involved in some grant categories, but not others? Which ones?


II. Strategies


My next questions are about the CHIPRA Quality Demonstration in [this state].


[NOTE: Use the cross-strategy module if the respondent has general knowledge. Use a category-specific module if the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]


Cross-Category


  1. Are you familiar with the state’s major strategies or approaches for CHIPRA over the last year? If so, how would you describe them?

    • How has their approach evolved over time?

  2. How would you characterize the state’s progress over the last year?

    • Why would you say that?

  3. What strategies seem to have worked well? What factors seem to be contributing to progress?

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  4. What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did the state work to overcome the challenges you’ve described?

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  5. Has your organization done anything differently to improve children’s care quality as a result of the CHIPRA demonstration?

    • Probe for examples.

  6. Now, we would like to discuss how other groups are responding to the demonstration. [Probe on groups as appropriate]

    • How are providers responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

      • What changes are they making to how they deliver care? What information do you have to assess these changes?

    • How are MCOs responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

      • What changes are they making as a result of the demonstration? Probe on changes to provider networks and provider payment methodologies. What information do you have to assess these changes?

    • How are patients and families responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

      • What kind of feedback are they giving on the demonstration? How did you gather this feedback?

    • How are any other stakeholders, such as hospitals or consumer advocates, responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

      • What changes are they making at their organization? What information do you have to assess these changes?

  7. Do you think the state will be able to sustain the CHIPRA activities after the grant period ends? Why or why not?

    • What will help sustain their efforts?

    • What activities will be difficult to sustain? Why?

  8. Are you aware of plans to spread or expand the CHIPRA activities?
















Quality Measurement Initiatives (Category A)


[NOTE: Use category A module if state has a category A project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]


The next few questions will focus on Category A, collecting and reporting quality measures. We are interested in the state’s efforts to report the initial and new CHIPRA core measures and other measures developed by the state. When talking about the measures, please specify whether they are being reported at the state, health plan, or provider level.


  1. Please describe how you and your organization have worked with the demonstration project staff to implement activities related to quality measurement in the last year.

    • Probe as needed on involvement in workgroups, provider selection and recruitment, learning collaboratives, development of resources or tools.

  2. Please elaborate on the major strategies or approaches the state used to meet their objectives or milestones in this area for the last year.

    • Probe on the initial and new core set and measures developed by the state.

    • Probe on efforts to report at the state, plan, and practice level and efforts to publicly report measures.

    • How has their approach evolved over time?

  3. How would you characterize their progress over the last year?

    • Are there any milestones they reached?

    • Which, if any, have not been reached on schedule or as planned?

  4. What strategies seem to have worked well? What factors seem to be contributing to progress?

    • Probe on the initial and new core set and measures developed by the state.

    • Probe on efforts to report at the state, plan, and practice level and efforts to publicly report measures.

  5. What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did the state work to overcome the challenges you’ve described?

    • Probe on the initial and new core set and measures developed by the state.

    • Probe on efforts to report at the state, plan, and practice level and efforts to publicly report measures.

  6. Which measures were relatively easy to collect? Why?



  7. Which measures were more difficult? Why?



  8. Were changes to the CMS core set and measure specifications helpful?

    • Were the changes clear?

    • Were they responsive to your needs?

  9. Is your organization using the measures? How?

    • Probe for examples.

  10. How is the state using the measures?

    • To support other parts of the demonstration?

    • To prepare reports? If so, please describe. Probe as needed on report audience, contents, and reporting frequency.

    • To support changes to payment structures or as a pay for performance measure? If so, please describe. Probe as needed on selection of core measures, weighting of measures, and provider/plan response.

    • To support other agency initiatives?

  11. Now, we would like to discuss how other stakeholder groups responded to the quality measures and reports. [Probe on new proposed CHIPRA quality measures and new measures developed by the state. Probe for groups who will collect or receive measure data.]

    • How are providers responding to the quality measures and reports?

      • Favorably, unfavorably, neutral or not aware?

      • How engaged were they in data collection?

      • How are they using the measures and reports? To support other parts of the demonstration? What information do you have to assess their use?

    • How are MCOs responding to the quality measures and reports?

      • Favorably, unfavorably, neutral or not aware?

      • How engaged were they in data collection?

      • How are they using the measures and reports? What information do you have to assess their use?



    • How are patients and families responding to the quality measures and reports?

      • Favorably, unfavorably, neutral or not aware?

      • What kind of feedback do you hear from families? How did you gather this feedback?

      • How are they using the data to guide their decisions? What information do you have to assess their use?

    • How are other stakeholders, such as hospitals or consumer advocates, responding to the quality measures and reports?

      • Favorably, unfavorably, neutral or not aware?

      • How are they using the measures and reports at their organization? What information do you have to assess their use?

  12. Do you think the state will be able to sustain the CHIPRA quality measurement and reporting? Why or why not?

    • What will help sustain their efforts?

    • What activities will be difficult to sustain? Why?

  13. Are you aware of plans to spread or expand the reporting efforts?



  14. Have other new quality reporting initiatives started in the last year? If so, describe how they interact with the CHIPRA work.

    • Probe on public, multi-payer, and payer specific initiatives.

    • To what extent do child-serving providers participate in these initiatives? (Providers could be practices, hospitals, school-based health centers, federally qualified health centers, and so forth.)

    • How would you compare the impact of the CHIPRA quality demonstration to these initiatives?












Health IT Initiatives (Category B)


[NOTE: Use category B module if state has a category B project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]


The next few questions will focus on Category B, health IT initiatives.


  1. Please describe how you and your organization have worked with the demonstration project staff to implement activities related to health IT in the last year.

    • Probe as needed on involvement in workgroups, provider selection and recruitment, learning collaboratives, development of resources or tools.

  2. Please elaborate on the major strategies or approaches the state used to meet their objectives or milestones in this area for the last year.

    • How has their approach evolved over time?

  3. How would you characterize their progress over the last year?

    • Are there any milestones you reached?

    • Which, if any, have not been reached on schedule or as planned?

  4. What strategies seem to have worked well? What factors seem to be contributing to progress?

    • Probe for each type of health IT implemented under CHIPRA.

  5. What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did they work to overcome the challenges you’ve described?

    • Probe on IT infrastructure; privacy and security; interoperability.

    • Probe for each type of health IT implemented under CHIPRA.

  6. Is your organization using the new health IT? How?

    • Probe for examples.

  7. How are state agencies using the health IT implemented under the CHIPRA demonstration?

    • To support other parts of the demonstration?

    • To share health information across agencies?

    • To share health information with practices?



  8. Now, we would like to discuss how other stakeholder groups are responding to the new health IT system. [Probe on groups who implement or use new system]

    • How are providers responding to the new system?

      • Favorably, unfavorably, neutral or not aware?

      • Did they implement the new system as planned? Why or why not?

      • How satisfied are they with the usability and functionality of the new system?

      • How are they using the system to improve quality of care? To support other parts of the demonstration? What information do you have to assess these changes?

    • How are MCOs responding to the new system?

      • Favorably, unfavorably, neutral or not aware?

      • Did they implement the new system as planned? Why or why not?

      • How satisfied are they with the usability and functionality of the new system?

      • How are they using the system to improve quality of care? What information do you have to assess these changes?

    • How are patients and families responding to the new health IT system?

      • Favorably, unfavorably, neutral or not aware?

      • How are they responding to the use of new health IT by providers? How did you gather this feedback?

      • [If system is patient-facing] How satisfied are they with the usability and functionality of the new system?

      • [If system is patient-facing] How are they using the system? What information do you have to assess these changes?

    • How are other agencies responding to the new health IT system?

      • Favorably, unfavorably, neutral or not aware?

      • Did they implement the new system as planned? Why or why not?

      • How satisfied are they with the usability and functionality of the new system?

      • How are they using the system to improve quality of care? What information do you have to assess these changes?

  9. Do you think the state will be able to sustain the CHIPRA health IT activities? Why or why not?

    • What will help sustain their efforts?

    • What activities will be difficult to sustain? Why?

  10. Are you aware of plans to spread or expand the health IT activities?

  11. How does the CHIPRA health IT project interact with other health IT initiatives in the state?

    • How are the activities under CHIPRA unique? What does the CHIPRA project add to the health IT landscape?

    • How are activities under CHIPRA supported by these initiatives? Integrated with these initiatives?

    • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • Probe on EHR meaningful use incentives, RECs, state HIE grant, Beacon programs



  12. How would you characterize the state’s health IT infrastructure?

    • If you were rating the infrastructure on a scale from 1 (very weak) to 10 (very strong), where would you rate your state now? Three years ago?

    • To what extent would you say the CHIPRA quality demonstration helped move the state along this continuum?



Provider-Based Initiatives (Category C)


[NOTE: Use category C module if state has a category C project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]


The next few questions will focus on Category C, provider-based models.


  1. Please describe how you and your organization have worked with the demonstration project staff to implement activities related to the provider-based models in the last year.

    • Probe as needed on involvement in workgroups, provider selection and recruitment, learning collaboratives, development of resources or tools.

  2. Please elaborate on the major strategies or approaches the state used to meet their objectives or milestones in this area for the last year.

    • Probe on technical assistance offered to providers, including Learning or QI Collaborative sessions, practice coaches or facilitators, and staff augmentation such as medical home coordinators, quality improvement specialists, or Title V staff.

    • How has your approach evolved over time?

  3. How would you characterize their progress over the last year?

    • Are there any milestones you reached?

    • Which, if any, have not been reached on schedule or as planned?

  4. What strategies seem to have worked well? What factors seem to be contributing to progress?

    • Probe for each strategy or approach.

  5. What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did they work to overcome the challenges you’ve described?

    • Probe for each strategy or approach.

  6. Is your organization doing anything differently as a result of the new provider-based model being implemented? What?

    • Probe for examples.

  7. Now, we would like to discuss how other stakeholder groups are responding to the demonstration. [Probe on groups as appropriate]

    • How are providers responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

      • How engaged are they in technical assistance or Learning Collaborative Activities?

      • Did some practices drop out of the initiative or just do the bare minimum? Did they experience problems with burnout or turnover?

      • What changes are they making to how they deliver care? Are they using data or health IT to help implement those changes? What information do you have to assess these changes?









    • How are MCOs responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

      • How engaged are they in technical assistance or Learning Collaborative Activities?

      • What changes are they making as a result of the demonstration? Probe on changes to provider networks and provider payment methodologies. What information do you have to assess these changes?

    • How are patients and families responding to changes at the practice level?

      • Favorably, unfavorably, neutral or not aware?

      • How satisfied are they with the changes? How did you gather this feedback?

    • How are other stakeholders responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

  8. Do you think the state will be able to sustain the provider-based initiatives? Why or why not?

    • What will help you sustain your efforts?

    • What activities will be difficult to sustain? Why?

  9. Are you aware of plans to spread or expand the provider-based initiatives?

  10. What new provider-based initiatives to improve quality of care with [physicians/SBHCs/CMEs] started in the last year? Please include private or commercial initiatives that you know of. How did the new initiatives interact with CHIPRA activities, if at all?

    • What was the scope of the project? What types of providers and how many participated?

    • How successful were these initiatives? What factors made these initiatives more or less successful?

    • Are you integrating your category C project with these initiatives? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • Can you identify anything about these other initiatives that was critical to the implementation of your Category C project?

    • How would you compare the impact of the CHIPRA quality demonstration to these initiatives?

Pediatric Electronic Health Record (Category D)


[NOTE: Use category D module if state has a category D project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]


The next few questions will focus on Category D, testing the model electronic health record.

When we were here in 2012, the state was [Summarize progress as of our last visit (e.g., finishing a gap analysis to compare the model EHR to existing ones).]


  1. Please describe how you and your organization have worked with the demonstration project staff to implement activities related to testing the model EHR format in the last year.

    • Probe as needed on involvement in workgroups, provider selection and recruitment, learning collaboratives, development of resources or tools.

  2. Please elaborate on the major strategies or approaches the state used to meet their objectives or milestones in this area for the last year.

    • How has your approach evolved over time?

  3. How would you characterize their progress over the last year?

    • Are there any milestones you reached?

    • Which, if any, have not been reached on schedule or as planned?



  4. What strategies seem to have worked well? What factors seem to be contributing to progress?

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  5. What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did they work to overcome the challenges you’ve described?

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  6. How have vendors responded to the model format?

    • What vendors did you work with?

    • Have you encountered any issues with vendors? If so, how are you addressing them?

    • How did CMS or ONC help the state work with vendors?



  7. Is your organization doing anything differently as a result of the state testing the model EHR format? What?

    • Probe for examples.

  8. [If providers implemented the system] How did providers respond to the model format?

    • Favorably, unfavorably, neutral or not aware?

    • Did they implement the new system as planned? Why or why not?

    • How satisfied are they with the usability and functionality of the new system?

    • How are they using the system to improve quality of care? How does it interact with other CHIPRA activities? What information do you have to assess these changes?

  9. [If providers implemented the system] How are patients and families responding to the model format?

    • Favorably, unfavorably, neutral or not aware?

    • Are parents involved in testing?

    • What kind of feedback are they giving on the demonstration? How did you gather this feedback?

  10. Do you think the state will be able to sustain the model EHR activities? Why or why not?

    • What will help sustain their efforts?

    • What activities will be difficult to sustain? Why?

  11. Are you aware of plans to spread or expand these efforts?

  12. [If providers implemented the system] What other initiatives are helping participating providers improve their health IT capacity? How do they interact with CHIPRA?

    • Probe on EHR meaningful use incentives, RECs, state HIE grant, Beacon programs

    • How successful were these initiatives? What factors made these initiatives more or less successful?

    • Is the state integrating its Category D project with these initiatives? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.


Other Quality Initiatives (Category E)


[NOTE: Use category E module if state has a category E project and the respondent has specialized knowledge.]


The next few questions will focus on Category E, other quality initiatives. [NOTE: Interviewer will customize these questions based on the state’s Category E initiative.]


  1. Please describe how you and your organization have worked with the demonstration project staff to implement activities related to the Category E project in the last year.

    • Probe as needed on involvement in workgroups, provider selection and recruitment, learning collaboratives, development of resources or tools.

  2. Please elaborate on the major strategies or approaches the state used to meet their objectives or milestones in this area for the last year.

    • How has your approach evolved over time?

  3. How would you characterize the state’s progress over the last year?

    • Are there any milestones you reached?

    • Which, if any, have not been reached on schedule or as planned?

  4. What strategies seem to have worked well? What factors seem to be contributing to progress?

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  5. What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did they work to overcome the challenges you’ve described?

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  6. Is your organization doing anything differently as a result of the Category E project? What?

    • Probe for examples.









  7. Now, we would like to discuss how other groups are responding to the Category E project. [Probe on groups as appropriate]

    • How are providers responding to the Category E project?

      • Favorably, unfavorably, neutral or not aware?

      • How engaged are they in technical assistance or Learning Collaborative Activities?

      • What changes are they making to how they deliver care? Are they using data or health IT to guide those changes? What information do you have to assess these changes?

    • How are MCOs responding to the Category E project?

      • Favorably, unfavorably, neutral or not aware?

      • How engaged are they in technical assistance or Learning Collaborative Activities?

      • What changes are they making as a result of the demonstration? Probe on changes to provider networks and provider payment methodologies. What information do you have to assess these changes?

    • How are patients and families responding to the Category E project?

      • Favorably, unfavorably, neutral or not aware?

      • What kind of feedback are they giving on the demonstration? How did you gather this feedback?

    • How are other stakeholders responding to the demonstration?

      • Favorably, unfavorably, neutral or not aware?

      • How are stakeholders making changes at their organization? What information do you have to assess these changes?

  8. Do you think the state will be able to sustain the Category E activities? Why or why not?

    • What will help sustain their efforts?

    • What activities will be difficult to sustain? Why?

  9. Are you aware of plans to spread or expand the Category E activities?







  10. What new similar initiatives were started in the state in the last year? How did the new initiatives interact with CHIPRA activities, if at all?

    • How successful were these initiatives? What factors made these initiatives more or less successful?

    • Are you integrating your category E project with these initiatives? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • Can you identify anything about these other initiatives that was critical to the implementation of your Category E project?

    • How would you compare the impact of the CHIPRA quality demonstration to these initiatives?

III. Stakeholder Involvement

  1. Overall, how satisfied are you with the level of involvement your organization has in implementing the demonstration?

  2. What changes, if any, do you suggest the state make to its strategy for involving stakeholders?

    • Involve additional stakeholders? Different stakeholders? Engage them in a different manner?


IV. Indications of Change

  1. How, if at all, has the CHIPRA demonstration resulted in positive impacts for children in [State]? [Have logic model handy if useful]

    • Please provide specific examples. Probe on shorter-term intermediate outcomes and ultimate impacts.

    • Do you have evidence that the quality of care improved?










V. State Context


Let’s move on to some contextual questions.


  1. Please tell me about any major state-level changes in the past year that have affected the CHIPRA quality demonstration in your state.

    • Changes in governor, legislature, key agency staff, or changes in their level of support? Changes in the state’s budget outlook?

    • What impact did they have on pace, strategy, or approach? On the expected outcomes from the demonstration?

    • Pace? Strategy or approach?

  2. Have there been major changes in the health sector in your state in the past year?

    • Growth or other changes in managed care in Medicaid or CHIP?

    • Greater interest in cutting Medicaid or CHIP costs?

    • The willingness of private health plans to participate in multi-payer efforts that involve Medicaid or CHIP?

    • New or changing PCMH efforts in private plans or being advocated by professional associations?

  3. What effect, if any, did these changes have on the implementation of the demonstration? On the expected outcomes from the demonstration?

    • Pace? Strategy or approach?

  4. How has implementation of the Affordable Care Act impacted the demonstration, if at all?

    • Pace? Strategy or approach?

  5. Any other important changes we should know about?

  6. What impact are these changes having on the implementation of the demonstration?

    • Pace? Strategy or approach?







VI. Lessons Learned

  1. What lessons have you learned or what insights and advice might you have for other states trying to implement similar quality improvement projects?

    • Any overarching or specific lessons you’d like to share are fine. But we’re also interested in your thoughts on any specific categories or projects.

    • Is there anything you would do differently if you could?

  2. How confident are you that another state could implement a similar project?

    • What elements of the project are essential for achieving the same results?

    • What about your state made it easier to implement? What made it more difficult?



VII. Wrap Up

  1. You have answered all my questions. Is there anything I didn’t ask that you’d like to tell me about?

Thank you very much for making time to speak with us.





























