2014 Key Staff INTERVIEW GUIDE
Form Approved
OMB No.:
Exp. Date:
KEY STAFF
Thank you for speaking with us today. In the email we sent confirming this interview, we provided information on who we are, why we’re here, and what topics we’re interested in discussing, and we assured you that your responses will be kept confidential. Do you have any questions before we start the interview? If not, may we begin recording the conversation?
If the respondent did not receive or does not remember the confirmation email, or if they have questions about the information provided in the email, review the introduction to the study on the next page.
Public reporting burden for this collection of information is estimated to average 90 minutes per response, the estimated time to complete the interview. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. Send comments regarding this burden estimate or any other aspects of this collection of information, including suggestions for reducing this burden, to: AHRQ Reports Clearance Officer, Attn: PRA, Paperwork Reduction Project (X), AHRQ, 540 Gaither Rd., Rm. 5036, Rockville, MD 20850.
Introduction to Study
We are from the Urban Institute and Mathematica Policy Research, and we are part of the team conducting the cross-state evaluation of the CHIPRA Quality Demonstration. The evaluation is federally funded by the Agency for Healthcare Research and Quality (AHRQ). The Centers for Medicare & Medicaid Services (CMS) provides the grants to states.
We are now conducting our second round of visits to the 18 demonstration states. We are meeting with people who are closely involved in the design, management, and day-to-day operations of each state’s demonstration initiatives, as well as other people who care about how the demonstrations affect children’s care quality in Medicaid and CHIP.
We are particularly interested in your thoughts and insights on several topics, including:
Your experience implementing your state’s quality demonstration, such as:
Major changes to strategies and activities for achieving your goals
What seems to have worked well
What seems to have worked less well
Changes you have made
Your experience working with partner states
Evidence that your strategies may be having the desired effect
Plans for sustaining and spreading your initiatives
Major changes in the state in the last year that have affected your ability to implement the demonstration or may influence the demonstration outcomes.
Our interview will take an hour and a half. Your responses will be kept confidential to the extent permitted by law, including AHRQ’s confidentiality statute, 42 USC 299c-3(c). Only evaluation team members will have access to your responses. We will report to AHRQ and CMS on the general themes that emerge from all of our discussions. Some reports may list the people we spoke to in a state, but we will not attribute specific comments or quotes to named individuals without permission. We would like to record our discussion in case we miss something in our notes and want to go back and listen. But, we do not plan to transcribe the recording.
Do you have any questions before we start? May I begin recording?
I. Introduction/Background
We have read your most recent grantee progress report and other documents grantees submit to CMS. In many of my questions today I’ll ask you to elaborate on those reports. My first questions are for background.
Please tell me what your position is here at [this organization] and how long you have worked here.
What are your responsibilities in connection with the state’s CHIPRA Quality Demonstration grant? Are you directly involved in some grant categories, but not others? Which ones?
II. Strategies
My next questions are about the CHIPRA Quality Demonstration in [this state].
[NOTE: Use the cross-strategy module if respondent has general knowledge. Use a category-specific module if respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]
Cross-Category
Please briefly describe the major strategies or approaches you have used to meet the state’s objectives or milestones for the last year.
How has your approach evolved over time?
And how would you characterize your progress over the last year?
Are there any milestones you reached?
Which, if any, have not been reached on schedule or as planned?
What strategies seem to have worked well? What factors seem to be contributing to progress?
Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.
What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did you work to overcome the challenges you’ve described?
Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.
Now, we would like to discuss how different stakeholder groups are responding to the demonstration. [Probe on groups involved in the demonstration]
How are providers responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
What changes are they making to how they deliver care? What information do you have to assess these changes?
How are MCOs responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
What changes are they making as a result of the demonstration? Probe on changes to provider networks and provider payment methodologies. What information do you have to assess these changes?
How are patients and families responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
What kind of feedback are they giving on the demonstration? How did you gather this feedback?
How are any other stakeholders, such as hospitals or consumer advocates, responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
What changes are they making at their organization? What information do you have to assess these changes?
What are your plans for sustaining the CHIPRA activities after the grant period ends?
What will help you sustain your efforts?
What activities will be difficult to sustain? Why?
What are your plans for spreading or expanding the CHIPRA activities?
III. Stakeholder Involvement
Now, I would like to talk about the involvement of other demonstration stakeholders, such as professional associations, state AAP chapters, and consumer advocacy groups that may play an advisory role on your projects.
What changes, if any, did you make to your strategy for involving stakeholders?
Involve additional stakeholders? Different stakeholders? Engage them in a different manner? Use a different method to encourage collaboration?
How has the degree or nature of stakeholder involvement facilitated or created challenges to implementing the demonstration?
How satisfied have you been with the level of stakeholder involvement in the demonstration?
Has the level of involvement changed since the demonstration started?
Probe as needed on involvement in each category.
IV. Indications of Change
How, if at all, has the CHIPRA demonstration resulted in positive impacts for children in [State]? [Have logic model handy if useful]
Please provide specific examples. Probe on shorter-term intermediate outcomes and ultimate impacts.
Do you have evidence that the quality of care improved?
V. State Context
Let’s move on to some contextual questions.
Please tell me about any major state-level changes in the past year that have affected the CHIPRA quality demonstration in your state.
Changes in governor, legislature, key agency staff, or changes in their level of support? Changes in the state’s budget outlook?
What impact did they have on the implementation of the demonstration? On the expected outcomes from the demonstration?
Pace? Strategy or approach?
Have there been major changes in the health sector in your state in the past year?
Growth or other changes in managed care in Medicaid or CHIP?
Greater interest in cutting Medicaid or CHIP costs?
The willingness of private health plans to participate in multi-payer efforts that involve Medicaid or CHIP?
New or changing PCMH efforts in private plans or being advocated by professional associations?
What effect, if any, did these changes have on the implementation of the demonstration? On the expected outcomes from the demonstration?
Pace? Strategy or approach?
How has implementation of the Affordable Care Act impacted the demonstration, if at all?
Pace? Strategy or approach?
Any other important changes we should know about?
What impact are these changes having on the implementation of the demonstration?
Pace? Strategy or approach?
VI. Experience as a Demonstration Grantee
[For multi-state grantees] Please comment on any benefits or challenges of working with partner states.
[For solo-state grantees] CMS seemed to encourage states to apply for the CHIPRA Quality grants in partnership with other states. A few states, like yours, applied as solo grantees. Does it still seem like the right decision? Even though you don’t have a partner state, per se, are there ways you benefit from interacting with other grantees?
How well has CMS’s grant structure served you as a grantee? What’s worked well? What would you change if you could?
Are the CMS reporting requirements for the grant reasonable?
Do you feel you had enough time to plan and/or implement your demonstration? Do you feel your final operational plan was adequate to guide implementation? Please explain.
Do you have any unmet needs for technical assistance that CMS might provide or help you access?
Participating in a federal grant program can be a catalyst for change, and it can present opportunity costs. Please talk about your experience as a CHIPRA grantee in these terms.
To what extent has the grant enabled the state to do something it would not have done otherwise?
To what extent has the grant prevented the state from doing something else? Would you have pursued different quality initiatives if you did not receive a CHIPRA grant?
How satisfied are you at this point with the state’s return on investment? (You don’t have to answer in monetary terms.) What makes you say that?
How are you tracking your return on investment? [Ask to receive relevant documents after the interview.]
Have you experienced any unexpected costs? [Probe as needed on different categories.]
VII. Lessons Learned
What lessons have you learned or what insights and advice might you have for other states trying to implement similar quality improvement projects?
Any overarching or specific lessons you’d like to share are fine. But, we’re also interested in your thoughts on any specific categories or projects.
Is there anything you would do differently if you could?
How confident are you that another state could implement a similar project?
What elements of the project are essential for achieving the same results?
What about your state made it easier to implement? What made it more difficult?
Would you recommend that CMS extend or expand the CHIPRA quality demonstration?
What would they need to consider before expanding the demonstration?
What could they do better to support states?
VIII. Wrap Up
You have answered all my questions. Is there anything I didn’t ask that you’d like to tell me about?
Thank you very much for making time to speak with us.
[Interviewer: Use Sections XI to XV with Other Key Staff as Appropriate]
XI. Quality Measurement Initiatives (Category A)
[NOTE: Use category A module if state has a category A project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]
The next few questions will focus on Category A, collecting and reporting quality measures. We are interested in the state’s efforts to report the initial and new CHIPRA core measures and other measures developed by the State. When talking about the measures, please specify if they are being reported at the state, health plan, or provider level.
Please briefly describe the major strategies or approaches you have used to meet the state’s objectives or milestones in this area for the last year.
Probe on the initial and new core set and measures developed by the state.
Probe on efforts to report at the state, plan, and practice level and efforts to publicly report measures.
How has your approach evolved over time?
How would you characterize your progress over the last year?
Are there any milestones you reached?
Which, if any, have not been reached on schedule or as planned?
What strategies seem to have worked well? What factors seem to be contributing to progress?
Probe on the initial and new core set and measures developed by the state.
Probe on efforts to report at the state, plan, and practice level and efforts to publicly report measures.
What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did you work to overcome the challenges you’ve described?
Probe on the initial and new core set and measures developed by the state.
Probe on efforts to report at the state, plan, and practice level and efforts to publicly report measures.
Which measures were relatively easy to collect? Why?
Which measures were more difficult? Why?
Were changes to the CMS core set and measure specifications helpful?
Were the changes clear?
Were they responsive to your needs?
How is the state using the measures?
To support other parts of the demonstration?
To prepare reports? If so, please describe. Probe as needed on report audience, contents, and reporting frequency.
To support changes to payment structures or as a pay for performance measure? If so, please describe. Probe as needed on selection of core measures, weighting of measures, and provider/plan response.
To support other agency initiatives?
Now, we would like to discuss how different stakeholder groups responded to the quality measures and reports. [Probe on new proposed CHIPRA quality measures and new measures developed by the state. Probe for groups who will collect or receive measure data]
How are providers responding to the quality measures and reports?
Favorably, unfavorably, neutral or not aware?
How engaged were they in data collection?
How are they using the measures and reports? To support other parts of the demonstration? What information do you have to assess their use?
How are MCOs responding to the quality measures and reports?
Favorably, unfavorably, neutral or not aware?
How engaged were they in data collection?
How are they using the measures and reports? What information do you have to assess their use?
How are patients and families responding to the quality measures and reports?
Favorably, unfavorably, neutral or not aware?
What kind of feedback do you hear from families? How did you gather this feedback?
How are they using the data to guide their decisions? What information do you have to assess their use?
How are any other stakeholders, such as hospitals or consumer advocates, responding to the quality measures and reports?
Favorably, unfavorably, neutral or not aware?
How are they using the measures and reports at their organization? What information do you have to assess their use?
What are your plans for sustaining the CHIPRA activities in this area after the grant period ends?
What will help you sustain your efforts?
What activities will be difficult to sustain? Why?
What are your plans for spreading or expanding the CHIPRA activities?
What will help you spread your efforts?
Have other new quality reporting initiatives started in the last year? If so, describe how they interact with the CHIPRA work.
Probe on public, multi-payer, and payer specific initiatives.
To what extent do child-serving providers participate in these initiatives? (Providers could be practices, hospitals, school-based health centers, federally qualified health centers, and so forth.)
Are you integrating data collection and reporting for the core measures with data collection and reporting for this other initiative? If so, how?
Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.
How would you compare the impact of the CHIPRA quality demonstration to these initiatives?
XII. Health IT Initiatives (Category B)
[NOTE: Use category B module if state has a category B project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]
The next few questions will focus on Category B, health IT initiatives.
Please briefly describe the major strategies or approaches you have used to meet the state’s objectives or milestones in this area for the last year.
How has your approach evolved over time?
How would you characterize your progress over the last year?
Are there any milestones you reached?
Which, if any, have not been reached on schedule or as planned?
What strategies seem to have worked well? What factors seem to be contributing to progress?
Probe for each type of health IT implemented under CHIPRA.
What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did you work to overcome the challenges you’ve described?
Probe on IT infrastructure; privacy and security; interoperability.
Probe for each type of health IT implemented under CHIPRA.
How are state agencies using the health IT implemented under the CHIPRA demonstration?
To support other parts of the demonstration?
To share health information across agencies?
To share health information with practices?
Now, we would like to discuss how different stakeholder groups are responding to the new health IT system. [Probe on groups who implement or use new system]
How are providers responding to the new system?
Favorably, unfavorably, neutral or not aware?
Did they implement the new system as planned? Why or why not?
How satisfied are they with the usability and functionality of the new system?
How are they using the system to improve quality of care? To support other parts of the demonstration? What information do you have to assess these changes?
How are MCOs responding to the new system?
Favorably, unfavorably, neutral or not aware?
Did they implement the new system as planned? Why or why not?
How satisfied are they with the usability and functionality of the new system?
How are they using the system to improve quality of care? What information do you have to assess these changes?
How are patients and families responding to the new health IT system?
Favorably, unfavorably, neutral or not aware?
How are they responding to the use of new health IT by providers? How did you gather this feedback?
[If system is patient-facing] How satisfied are they with the usability and functionality of the new system?
[If system is patient-facing] How are they using the system? What information do you have to assess these changes?
How are other stakeholders, such as hospitals or consumer advocates, responding to the new system?
Favorably, unfavorably, neutral or not aware?
Did they implement the new system as planned? Why or why not?
How satisfied are they with the usability and functionality of the new system?
How are they using the system to improve quality of care? What information do you have to assess these changes?
What are your plans for sustaining CHIPRA activities in this area after the grant period ends?
What will help you sustain your efforts?
What activities will be difficult to sustain? Why?
What are your plans for spreading or expanding the CHIPRA activities?
What will help you spread your efforts?
How does your CHIPRA health IT project interact with other health IT initiatives in the state?
How are your activities under CHIPRA unique? What does your CHIPRA project add to the health IT landscape?
How are your activities under CHIPRA supported by these initiatives? Integrated with these initiatives?
Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.
Probe on EHR meaningful use incentives, RECs, state HIE grant, Beacon programs
How would you characterize the state’s health IT infrastructure?
If you were rating the infrastructure on a scale from 1 (very weak) to 10 (very strong), where would you rate your state now? Three years ago?
To what extent would you say the CHIPRA quality demonstration helped move the state along this continuum?
XIII. Provider-Based Initiatives (Category C)
[NOTE: Use category C module if state has a category C project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]
The next few questions will focus on Category C, provider-based models.
Please briefly describe the major strategies or approaches you have used to meet the state’s objectives or milestones in this area for the last year.
Probe on technical assistance offered to providers, including learning or QI collaborative sessions, practice coaches or facilitators, and staff augmentation such as medical home coordinators, quality improvement specialists, or Title V staff.
How has your approach evolved over time?
How would you characterize your progress over the last year?
Are there any milestones you reached?
Which, if any, have not been reached on schedule or as planned?
What strategies seem to have worked well? What factors seem to be contributing to progress?
Probe for each strategy or approach.
What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did you work to overcome the challenges you’ve described?
Probe for each strategy or approach.
Now, we would like to discuss how different stakeholder groups are responding to the demonstration. [Probe on groups who implement or use new system]
How are providers responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
How engaged are they in technical assistance or Learning Collaborative Activities?
Did some practices drop out of the initiative or just do the bare minimum? Did they experience problems with burnout or turnover?
What changes are they making to how they deliver care? Are they using data or health IT to help implement those changes? What information do you have to assess these changes?
How are MCOs responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
How engaged are they in technical assistance or Learning Collaborative Activities?
What changes are they making as a result of the demonstration? Probe on changes to provider networks and provider payment methodologies. What information do you have to assess these changes?
How are patients and families responding to changes at the practice level?
Favorably, unfavorably, neutral or not aware?
How satisfied are they with the changes? How did you gather this feedback?
How are other stakeholders responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
What are your plans for sustaining CHIPRA activities in this area after the grant period ends?
What will help you sustain your efforts?
What activities will be difficult to sustain? Why?
What are your plans for spreading or expanding the CHIPRA activities?
What will help you spread your efforts?
What new provider-based initiatives to improve quality of care with [physicians/SBHCs/CMEs] started in the last year? Please include private or commercial initiatives that you know of. How did the new initiatives interact with CHIPRA activities, if at all?
What was the scope of the project? What types of providers and how many participated?
How successful were these initiatives? What factors made these initiatives more or less successful?
Are you integrating your category C project with these initiatives? If so, how?
Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.
Can you identify anything about these other initiatives that was critical to the implementation of your category C project?
How would you compare the impact of the CHIPRA quality demonstration to these initiatives?
XIV. Pediatric Electronic Health Record (Category D)
[NOTE: Use category D module if state has a category D project and the respondent has specialized knowledge. Review interview notes and progress reports to customize the protocols. Probe on how strategies, barriers, and facilitators have changed over time.]
The next few questions will focus on Category D, testing the model electronic health record.
When we were here in 2012, the state was [Summarize progress as of our last visit (e.g., finishing a gap analysis to compare the model EHR to existing ones).]
What goals and objectives has the category D project team been trying to meet since then?
Please briefly describe the major strategies or approaches you have used to meet the state’s objectives or milestones in this area for the last year.
How has your approach evolved over time?
How would you characterize your progress over the last year?
Are there any milestones you reached?
Which, if any, have not been reached on schedule or as planned?
What strategies seem to have worked well? What factors seem to be contributing to progress?
Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.
What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did you work to overcome the challenges you’ve described?
Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.
Please describe your state’s level of involvement in the implementation of the model EHR format (if applicable).
How satisfied are you with the level of involvement your state has in implementing the model EHR format?
How have vendors responded to the model format?
What vendors did you work with?
Have you encountered any issues with vendors? If so, how are you addressing them?
How did CMS or ONC help the state work with vendors?
[If providers implemented the system] What resources, including training, materials, or tools, did you make available to providers to help them implement the new system? [Ask for copies of slides, materials, tools, or other resources.]
How did providers respond to those resources?
[If providers implemented the system] How did providers respond to the model format?
Favorably, unfavorably, neutral or not aware?
Did they implement the new system as planned? Why or why not?
How satisfied are they with the usability and functionality of the new system?
How are they using the system to improve quality of care? How does it interact with other CHIPRA activities? What information do you have to assess these changes?
[If providers implemented the system] How are patients and families responding to the model format?
Favorably, unfavorably, neutral or not aware?
Are parents involved in testing?
What kind of feedback are they giving on the demonstration? How did you gather this feedback?
[If providers implemented the system] If possible, please quantify the start-up costs of the new system.
What are your plans for sustaining CHIPRA activities in this area after the grant period ends?
What will help you sustain your efforts?
What activities will be difficult to sustain? Why?
What are your plans for spreading or expanding the CHIPRA activities?
What will help you to spread your activities?
[If providers implemented the system] What other initiatives are helping participating providers improve their health IT capacity? How do they interact with CHIPRA?
Probe on EHR meaningful use incentives, RECs, state HIE grant, Beacon programs
How successful were these initiatives? What factors made these initiatives more or less successful?
Are you integrating your category D project with these initiatives? If so, how?
Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.
XV. Other Quality Initiatives (Category E)
[NOTE: Use category E module if state has a category E project and the respondent has specialized knowledge.]
The next few questions will focus on Category E, other quality initiatives. [NOTE: Interviewer will customize these questions based on the state’s Category E initiative.]
Please briefly describe the major strategies or approaches you have used to meet the state’s objectives or milestones in this area for the last year.
How has your approach evolved over time?
How would you characterize your progress over the last year?
Are there any milestones you reached?
Which, if any, have not been reached on schedule or as planned?
What strategies seem to have worked well? What factors seem to be contributing to progress?
Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.
What strategies seem to have worked less well? What factors seem to be inhibiting progress? How did you work to overcome the challenges you’ve described?
Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.
Now, we would like to discuss how different groups are responding to the Category E project. [Probe on groups who implement or use new system]
How are providers responding to the Category E project?
Favorably, unfavorably, neutral or not aware?
How engaged are they in technical assistance or Learning Collaborative Activities?
What changes are they making to how they deliver care? Are they using data or health IT to guide those changes? What information do you have to assess these changes?
How are MCOs responding to the Category E project?
Favorably, unfavorably, neutral or not aware?
How engaged are they in technical assistance or Learning Collaborative Activities?
What changes are they making as a result of the demonstration? Probe on changes to provider networks and provider payment methodologies. What information do you have to assess these changes?
How are patients and families responding to the Category E project?
Favorably, unfavorably, neutral or not aware?
What kind of feedback are they giving on the demonstration? How did you gather this feedback?
How are other stakeholders responding to the demonstration?
Favorably, unfavorably, neutral or not aware?
How are stakeholders making changes at their organization? What information do you have to assess these changes?
What are your plans for sustaining CHIPRA activities in this area after the grant period ends?
What will help you sustain your efforts?
What activities will be difficult to sustain? Why?
What are your plans for spreading or expanding the CHIPRA activities?
What will help you to spread your activities?
What new similar initiatives were started in the state in the last year? How did the new initiatives interact with CHIPRA activities, if at all?
How successful were these initiatives? What factors made these initiatives more or less successful?
Are you integrating your category E project with these initiatives? If so, how?
Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.
Can you identify anything about these other initiatives that was critical to the implementation of your Category E project?
How would you compare the impact of the CHIPRA quality demonstration to these initiatives?