






Attachment E

2012 IMPLEMENTATION STAFF INTERVIEW GUIDE

Form Approved

OMB No: 0935-0190

Exp. Date 02/28/2015



CHIPRA QUALITY DEMONSTRATION EVALUATION

2012 INTERVIEW GUIDE


OTHER IMPLEMENTATION PERSONNEL




Thank you for speaking with us today. In the email we sent confirming this interview, we provided information on who we are, why we’re here, and what topics we’re interested in discussing, and we assured you that your responses will be kept confidential. Do you have any questions before we start the interview? If not, may we begin recording the conversation?



If the respondent did not receive or does not remember the confirmation email, or if they have questions about the information provided in the email, review the introduction to the study on the next page.


Public reporting burden for this collection of information is estimated to average 60 minutes per response, the estimated time to complete the interview. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. Send comments regarding this burden estimate or any other aspects of this collection of information, including suggestions for reducing this burden, to: AHRQ Reports Clearance Officer Attn: PRA, Paperwork Reduction Project (0935-XXXX) AHRQ, 540 Gaither Rd., Rm. 5036, Rockville MD 20850.



Introduction to Study


We are from the Urban Institute and Mathematica Policy Research, and we are part of the team conducting the cross-state evaluation of the CHIPRA Quality Demonstration. The evaluation is federally funded by the Agency for Healthcare Research and Quality (AHRQ). The Centers for Medicare & Medicaid Services (CMS) provides the grants to states.


We are now conducting our first round of visits to the 18 demonstration states. We are meeting with people who are closely involved in the design, management, and day-to-day operations of each state’s demonstration initiatives, as well as other people who care about how the demonstrations affect children’s care quality in Medicaid and CHIP.


We are particularly interested in your thoughts and insights on several topics, including:


  1. Changes in the state that have occurred since you submitted your final operational plan to CMS (late 2010 or early 2011) that may affect your demonstration plans or your ability to implement them.


  2. Your experience to date implementing your state’s quality demonstration, such as:

    1. Major strategies and activities for achieving your goals

    2. What seems to be working well

    3. What seems to be working less well

    4. Changes you have made


  3. Evidence to date that your strategies may be having the desired effect


  4. Key issues for the coming year


Our interview will take an hour. Your responses will be kept confidential to the extent permitted by law, including AHRQ’s confidentiality statute, 42 USC 299c-3(c). We will share everyone’s comments with members of the evaluation team and we will report to AHRQ and CMS on the general themes that emerge from all of our discussions. Our reports will list the people we spoke to in each state, but we will not attribute specific comments or quotes to named individuals without permission. We would like to record our discussion in case we miss something in our notes and want to go back and listen, but we do not plan to transcribe the recording.


Do you have any questions before we start? May I begin recording?



I. Introduction/Background


My first questions are for background.


  1. Please tell me what your position is here at [this organization] and how long you have worked here.

  2. What are your responsibilities in connection with the state’s CHIPRA Quality Demonstration grant? Are you directly involved in some grant categories, but not others? Which ones?

  3. In your own words, what are the state’s major goals for the demonstration?

    • What improvements to care quality in Medicaid and CHIP does the state want to achieve in the next four years?

  • What impact on utilization and expenditure does the state hope to achieve?

  • How does the state hope to impact transparency and consumer choice?


II. State Context


Let’s move on to some contextual questions.


  1. We understand that [researcher summarizes known information on state-level changes]. Have there been any other major state-level changes since the final operational plan was submitted in December 2010?

    • Changes in governor, legislature, key agency staff, or changes in their level of support? Changes in the state’s budget outlook?

  2. What impact, if any, are these changes having on the implementation of the demonstration?

    • Pace? Strategy or approach?

  3. We understand that [researcher summarizes known information on health sector changes]. Have there been any other major changes in the health sector?

    • Growth or other changes in managed care in Medicaid or CHIP?

    • Greater interest in cutting Medicaid or CHIP costs?

    • The willingness of private health plans to participate in multi-payer efforts that involve Medicaid or CHIP?

    • New or changing PCMH efforts in private plans or being advocated by professional associations?

  4. What impact are these changes having on the implementation of the demonstration?

    • Pace? Strategy or approach?

  5. Have there been any other changes that are outside of, but important to, the CHIPRA Quality Demonstration?

  6. What impact are these changes having on the implementation of the demonstration?

    • Pace? Strategy or approach?


III. Strategies


My next questions are about the CHIPRA Quality Demonstration in [this state].


[Interviewer: Use the cross-category module if the respondent has general knowledge, or a category-specific module if the respondent has specialized knowledge.]


Cross-Category



  1. What would you describe as the major goals for this first year of the demonstration in your state (again, we’re referring to the period since your final operational plan)?

    • What goals or milestones have you most focused on achieving?

  2. Please briefly describe the major strategies or approaches the state has been using to accomplish those goals.

  3. And how would you characterize the progress toward those goals and milestones?

    • Which, if any, have you reached or come close to reaching?

    • Which, if any, have not been reached on schedule or as planned?

    • How are you monitoring your progress?

  4. What strategies seem to be working well? What factors seem to be contributing to progress?

  • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  5. What strategies seem to be working less well? What factors seem to be inhibiting progress?

  • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  6. How are you trying to overcome the challenges you’ve described? What approaches is the state using or considering?

    • Probe for examples.



Quality Measurement Initiatives (Category A)


[NOTE: Use category A module if state has a category A project and the respondent has specialized knowledge.]


The next few questions will focus on Category A, collecting and reporting the core set of CHIPRA quality measures.


  1. [Researcher should briefly summarize goals mentioned previously.] What would you describe as the major goals in this area for the first year of the demonstration?

    • What goals or milestones have you most focused on achieving in this area?

  2. Please briefly describe the major strategies or approaches the state has been using this year to accomplish the goals in this area.

    • Probe as needed on changes to data infrastructure or reporting systems.

  3. How would you characterize this year’s progress toward those goals and milestones related to quality measurement?

    • Which, if any, have you reached or come close to reaching?

    • Which, if any, have not been reached on schedule or as planned?

    • How are you monitoring your progress?

  4. Thinking beyond this year for a moment, would you say the major strategies and outcomes in this logic model still represent your overall approach to this area of your demonstration in the long run? Are there any important differences? You’ve mentioned x and y strategy, but z from the FOP was not mentioned. Is that strategy coming later?

  5. In your experience so far, what strategies seem to be working well? What factors seem to be contributing to progress?

    • Please describe specific tactics or processes you have used that have worked well.

    • Probe as needed on data infrastructure, skills and experience of staff, effective planning, high-level support, other external factors.

  6. What strategies seem to be working less well? What factors seem to be inhibiting progress?

    • Please think about tactics or processes that have been less successful.

    • Probe as needed on data infrastructure, skills and experience of staff, effective planning, high-level support, other external factors.

  7. How are you trying to overcome the challenges you’ve described? What approaches is the state using or considering?

    • Probe for examples.



  8. In your view, how comprehensive and clear were the CMS measure specifications?

    • What areas were most unclear? How could the specifications be improved, if at all?

  9. [If providers are being asked to collect and report on measures] How did the project team go about recruiting and selecting providers to collect and report on the measures?

    • What factors were considered in selecting providers? Their health IT sophistication? Past experience collecting and reporting core measures? Past experience implementing quality improvement initiatives? Or were providers selected primarily to participate in another grant category that also involves category A?

    • If needed, probe on the number of targeted providers, the number actually recruited, and how quickly or easily they were recruited.

    • Did you experience any issues establishing data use and end user data agreements?

    • How similar or different are the participating providers from other providers in the state?

    • What was the nature of the state’s relationship with these providers before the CHIPRA demonstration began?

  10. [If providers are being asked to collect and report on measures] What strategy are you using to motivate providers to collect and report on the measures?

    • Are you providing financial incentives such as one-time payments, performance bonuses, changes in reimbursement?

    • Are you providing other incentives, such as free software, or technical assistance to implement the project?

    • How are providers responding to the incentives so far? Would you draw any early conclusions about the incentives based on your observations?

  11. [If providers are being asked to collect and report on measures] Are you providing resources, including training, materials, or tools, to providers to help them collect and report on the measures? If so, please describe. [Ask for copies of slides, materials, tools, or other resources.]

    • How are providers responding to the resources?

  12. [If MCOs are being asked to collect and report on measures] What strategy are you using to get MCOs to collect and report on measures?

  • How are MCOs responding to this strategy?

  • Did you experience any issues establishing data use and end user data agreements?

  13. [If MCOs are being asked to collect and report on measures] Are you making resources, including training, materials, or tools, available to the MCOs? If so, please describe. [Ask for copies of slides, materials, tools, or other resources.]

  • How are MCOs responding to the available resources?

  14. Are you providing resources, including training, materials, or tools, to other groups? If so, please describe. [Ask for copies of slides, materials, tools, or other resources.]

    • Why did you target these groups to receive resources?

    • How are these groups responding to the resources?

  15. Are providers, MCOs, or other individuals receiving resources, including training, materials, or tools, in addition to the resources provided through this demonstration? If so, please describe.

  • Who is providing these resources? How are they funded?

  • How are the providers, MCOs or other individuals responding to these resources?

  16. How is measure collection going?

    • How often will the measures be collected? When?

    • For what population or subpopulation of children? E.g., all children in Medicaid and/or CHIP, only children in managed care, etc.

    • Which measures have been collected or do you suspect will be relatively easy to collect? Why?

    • Which measures are more difficult? Why?

    • What might help overcome these obstacles?

    • In what instances, if any, has the state altered or deviated from the CMS specifications?

  17. If possible, please quantify the start-up costs for collecting and reporting on the new measures.

    • How do they compare to costs of ongoing data collection and reporting?













  18. Has your state received technical assistance from a CMS contractor to help you collect and report the core measures?

    • [If TA received] What type of assistance did the state receive? How satisfied are you with the technical assistance you have received from CMS?

    • [If TA not received] Why have you not received technical assistance? Did your state not need assistance? Could you not get the type of assistance you needed?

    • Do you have additional needs for technical assistance from CMS? If so, what are they?

  19. How are you using the measures, or planning to use them?

    • To support other parts of the demonstration?

  • To prepare reports? If so, please describe. Probe as needed on report audience, contents, and reporting frequency.

  • To support changes to payment structures or as a pay for performance measure? If so, please describe. Probe as needed on selection of core measures, weighting of measures, and provider/plan response.

    • To support other agency initiatives?

  20. Before you started implementing the CHIPRA demonstration in December 2010, what quality measures (including HEDIS and CAHPS measures), if any, did Medicaid and CHIP plans and/or providers in [this state] collect and report on?

    • Did plans or providers report pediatric focused measures? What measures?

  21. Before you started implementing the CHIPRA demonstration in December 2010, did providers receive reports on quality or utilization? If so, please describe.

  • What information was contained in the reports? How often did they receive the reports? To what extent did quality reports for child-serving providers include pediatric focused measures?

  • How useful did providers find these reports, to your knowledge?

  • Were these reports part of formal pay for performance (P4P) incentive programs?

  22. How has the CHIPRA quality demonstration changed the reporting requirements for Medicaid and CHIP plans and/or providers? How has it changed the reports providers receive?

    • To what extent has it increased the data collection and reporting burden on plans and/or providers? How have plans and/or providers responded to this change?

    • Do the reports come more or less frequently? Do they contain different information? Are they used for a different purpose?

  23. Are there other initiatives related to quality reporting in Medicaid and CHIP in the state that we should know about? If so, please describe.

    • To what extent do child-serving providers participate in these initiatives? (Providers could be practices, hospitals, school-based health centers, federally qualified health centers, and so forth.)

    • Are you integrating data collection and reporting for the core measures with data collection and reporting for this other initiative? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • How would you compare the impact of the CHIPRA quality demonstration to these initiatives?


Health IT Initiatives (Category B)


[NOTE: Use category B module if state has a category B project and the respondent has specialized knowledge.]


The next few questions will focus on Category B, health IT initiatives.


  1. [Researcher should briefly summarize goals mentioned previously.] What would you describe as the major goals in this area for the first year of the demonstration?

    • What goals or milestones have you most focused on achieving in this area?

  2. Please briefly describe the major strategies or approaches the state has been using this year to accomplish the goals in this area.

  3. How would you characterize this year’s progress toward those goals and milestones in this area?

    • Which, if any, have you reached or come close to reaching?

    • Which, if any, have not been reached on schedule or as planned?

    • How are you monitoring your progress?













  4. Thinking beyond this year for a moment, would you say the major strategies and outcomes in this logic model still represent your overall approach to this area of your demonstration in the long run? Are there any important differences? You’ve mentioned x and y strategy, but z from the FOP was not mentioned. Is that strategy coming later?

  • Probe as needed on hardware and software requirements, systems that will be connected with the health IT, and intended end users.

  • What is the “information” in your health information technology project—in other words, what type of information will be communicated or captured more readily once your project is implemented?

  • For what purpose will the information be communicated or captured? To reduce error or redundancy, to increase coordination or continuity?

  • Are there ways in which the project specifically addresses children with special health care needs?

  5. In your experience so far, what strategies seem to be working well? What factors seem to be contributing to progress?

    • Please describe specific tactics or processes you have used that have worked well.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  6. What strategies seem to be working less well? What factors seem to be inhibiting progress?

    • Please think about tactics or processes that have been less successful.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  7. How are you trying to overcome the challenges you’ve described? What approaches is the state using or considering?

  8. [If providers will implement or use new health IT] How did the project team go about recruiting and selecting providers for the health IT initiative?

    • What factors were considered in selecting providers? Their health IT sophistication? Past experience collecting and reporting core measures? Past experience implementing quality improvement initiatives?

    • If needed, probe on the number of targeted providers, the number actually recruited, and how quickly or easily they were recruited.

    • How similar or different are the participating providers from other providers in the state?

    • What was the nature of the state’s relationship with these providers before the CHIPRA demonstration began?

  9. [If providers will implement or use new health IT] What strategy are you using to motivate providers to implement and/or use the new system?

    • Are you providing financial incentives such as one-time payments, performance bonuses, changes in reimbursement?

    • Are you providing other incentives, such as free software, or technical assistance to implement the project?

    • How are providers responding to the incentives so far? Would you draw any early conclusions about the incentives from your observations?

  10. [If providers will implement or use new health IT] What resources, including training, materials, or tools, are you providing to providers to help them implement the new system? [Ask for copies of slides, materials, tools, or other resources.]

    • How are providers responding to the resources?



  11. [If MCOs will implement or use new health IT] What strategy are you using to get MCOs to participate in this demonstration and make the desired changes?

  • How are MCOs responding to this strategy?

  12. [If MCOs will implement or use new health IT] Are you making resources, including training, materials, or tools, available to the MCOs? If so, please describe. [Ask for copies of slides, materials, tools, or other resources.]

  • How are MCOs responding to the available resources?

  13. Are you providing resources, including training, materials, or tools, to other providers or agencies? If so, please describe. [Ask for copies of slides, materials, tools, or other resources.]

    • Why did you target these groups to receive resources?

    • How are these groups responding to the resources?

  14. Are providers, MCOs, or other individuals receiving resources, including training, materials, or tools, in addition to the resources provided through this demonstration? If so, please describe.

  • Who is providing these resources? How are they funded?

  • How are providers, MCOs, or other individuals responding to these resources?

  15. [If implementation of new system is underway] In general, how is implementation of the new system going?

    • How are you monitoring the quality of implementation?

  16. Have you encountered any challenges so far?

  • Probe on IT infrastructure; privacy and security; interoperability.

  17. If so, how are you addressing the challenges?

  18. If possible, please quantify the start-up costs of the new system.

    • How do they compare to the ongoing costs of implementation?

  19. [If implementation of new system is underway] How are providers or other end users, such as agency staff or patients, responding to the new system?

    • Are they using the system as intended? What information is being communicated? [If data aggregated] Who is aggregating data? Who is receiving the aggregate data?

    • How satisfied are they with the usability and functionality of the new system? With the data provided by the system? Probe as needed on use with target populations, such as children with special health care needs.

    • How are they integrating the new system into their existing workflow?

  20. Before you started implementing the CHIPRA demonstration in December 2010, how would you have characterized the state’s health IT infrastructure?

    • If you were rating the infrastructure on a scale from 1 (very weak) to 10 (very strong), where would your state have been?

    • Uptake of EHRs by child-serving hospitals and physicians?

    • The health information exchange capacity within the state?

    • Usefulness of any electronic registries, such as immunization registries?

  21. Before you started implementing the CHIPRA demonstration in December 2010, what initiatives were underway to encourage change in these areas? I am thinking of federal initiatives like an HIE cooperative agreement, regional extension centers, and the EHR meaningful use incentive program.

    • To what extent do child-serving providers participate in these initiatives? (Providers could be practices, hospitals, school-based health centers, federally qualified health centers, and so forth.)

    • Are you integrating your category B project with these initiatives? If so, how?

    • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • How successful are these initiatives? What factors make these initiatives more or less successful?

    • Can you identify anything about these other initiatives that was critical to the conception, planning, or early implementation of your CHIPRA health IT project?



  22. How is the state’s ability in these areas changing?

    • To what extent would you say the CHIPRA quality demonstration is stimulating or driving these changes?

    • [Researcher will add probes if necessary to build on information provided in the state’s FOP.] How would you compare the CHIPRA quality demonstration activities to others, like HITECH dollars, plan incentive programs, or other efforts? (To the extent you can isolate the demonstration from other health IT initiatives.)

Provider-Based Initiatives (Category C)


[NOTE: Use category C module if state has a category C project and the respondent has specialized knowledge.]


The next few questions will focus on Category C, provider-based models.


  1. [Researcher should briefly summarize goals mentioned previously.] What would you describe as the major goals in this area for the first year of the demonstration?

    • What goals or milestones have you most focused on achieving in this area?

  2. Please briefly describe the major strategies or approaches the state has been using this year to accomplish the goals in this area.

  3. How would you characterize this year’s progress toward those goals and milestones in this area?

    • Which, if any, have you reached or come close to reaching?

    • Which, if any, have not been reached on schedule or as planned?

    • How are you monitoring your progress?

  4. Thinking beyond this year for a moment, would you say the major strategies and outcomes in this logic model still represent your overall approach to this area of your demonstration in the long run? Are there any important differences? You’ve mentioned x and y strategy, but z from the FOP was not mentioned. Is that strategy coming later?

  5. In your experience so far, what strategies seem to be working well? What factors seem to be contributing to progress?

    • Please describe specific tactics or processes you have used that have worked well.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.





  6. What strategies seem to be working less well? What factors seem to be inhibiting progress?

    • Please think about tactics or processes that have been less successful.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  7. How are you trying to overcome the challenges you’ve described? What approaches is the state using or considering?



  8. How did you develop a plan for implementing the provider-based delivery model in [state]?

    • What role did different stakeholders play? How did you encourage collaboration among stakeholders, including providers and payers?

  9. How did the project team go about recruiting and selecting providers [physicians/SBHCs/CMEs] for the provider-based delivery model?

    • What factors were considered in selecting providers? Their health IT sophistication? Past experience collecting and reporting core measures? Past experience implementing quality improvement initiatives? Medical homeness at baseline?

    • If needed, probe on the number of targeted providers, the number actually recruited, and how quickly or easily they were recruited.

    • How similar or different are the participating providers from other providers in the state?

    • What was the nature of the state’s relationship with these providers before the CHIPRA demonstration began?

  10. What strategy are you using to motivate providers [physicians/SBHCs/CMEs] to participate in the demonstration and make the desired changes?

    • Are you providing financial incentives such as one-time payments, performance bonuses, changes in reimbursement?

    • Are you providing other incentives, such as free software, or technical assistance to implement the project?

    • How are providers responding to the incentives so far? Would you draw any early conclusions about the incentives from your observations?









  11. What resources, including training, materials, or tools, are you making available to the providers [physicians/SBHCs/CMEs] participating in your provider-based model? [Ask for copies of slides, materials, tools, or other resources.]

  • [If Learning Collaborative model used] Probe on content delivered in Learning Collaborative sessions.

  • How are providers responding to the available resources?

  12. How much progress are the providers [physicians/SBHCs/CMEs] making towards implementing the provider-based model?


    • How are you monitoring the quality of implementation?

    • Are they using practice-level quality data to drive these changes? If so, please describe.


  13. Are any [physicians/SBHCs/CMEs] moving forward faster with implementation?

    • What factors are making it easier for them? Characteristics of the provider? Factors external to the provider?

  14. Are any [physicians/SBHCs/CMEs] experiencing a slower start?

    • What factors are making it harder for them? Characteristics of the provider? Factors external to the provider?

  15. [If MCOs are involved in implementation] What strategy are you using to get MCOs to participate in this demonstration and make the desired changes?

  • How are MCOs responding to the demonstration?

  16. [If MCOs are involved in implementation] Are you making resources, including training, materials, or tools, available to the MCOs? If so, please describe. [Ask for copies of slides, materials, tools, or other resources.]

  • How are MCOs responding to the available resources?

  17. Are you making resources, including training, materials, or tools, available to any other groups? If so, please describe. [Ask for copies of slides, materials, tools, or other resources.]

    • Why did you target these groups to receive resources?

  • How are these groups responding to the available resources?







  18. Are providers, MCOs, or other individuals receiving resources, including training, materials, or tools, in addition to the resources provided through this demonstration? If so, please describe.


  • Who is providing these resources? How are they funded?

  • How are providers, MCOs, or other individuals responding to these resources?

  19. If possible, please quantify the start-up costs of the new delivery model.

    • How do they compare to the ongoing costs of implementation?

  20. Before you started implementing the CHIPRA demonstration in December 2010, what provider-based initiatives to improve quality of care were underway with [physicians/SBHCs/CMEs]? Please include private or commercial initiatives that you know of.

    • What was the scope of the project? What types of providers and how many participated?

    • How successful were these initiatives? What factors made these initiatives more or less successful?

    • Are you integrating your category C project with these initiatives? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • Can you identify anything about these other initiatives that was critical to the conception, planning, or early implementation of your Category C project?

    • How would you compare the impact of the CHIPRA quality demonstration to these initiatives?

  21. In addition to the CHIPRA demonstration, are there other new provider-based initiatives in the state to improve quality of care in [practices/SBHCs/CMEs]? If so, please describe. We're interested in initiatives planned for the next 5 years or so, even if they haven't begun yet. Again, please include private initiatives you may know of.

    • What is the scope of the new project? What types of providers and how many are participating?

    • Are you integrating your category C project with these initiatives? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • How would you compare the impact of the CHIPRA quality demonstration to these initiatives?

Pediatric Electronic Health Record (Category D)


[NOTE: Use category D module if state has a category D project and the respondent has specialized knowledge.]


The next few questions will focus on Category D, testing the model electronic health record. [The state] has completed its 12-month planning phase and is now in the infrastructure development phase. You will soon receive the EHR format, correct?


  1. [Researcher should briefly summarize goals mentioned previously.] What would you describe as the major goals in this area for the first year of the demonstration?

    • What goals or milestones have you most focused on achieving in this area?

  2. Please briefly describe the major strategies or approaches the state has been using this year to accomplish the goals in this area.

  3. How would you characterize this year’s progress toward those goals and milestones in this area?

    • Which, if any, have you reached or come close to reaching?

    • Which, if any, have not been reached on schedule or as planned?

    • How are you monitoring your progress?

  4. Thinking beyond this year for a moment, would you say the major strategies and outcomes in this logic model still represent your overall approach to this area of your demonstration in the long run? Are there any important differences? You’ve mentioned x and y strategy, but z from the FOP was not mentioned. Is that strategy coming later?

  5. In your experience so far, what strategies seem to be working well? What factors seem to be contributing to progress?

    • Please describe specific tactics or processes you have used that have worked well.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  6. What strategies seem to be working less well? What factors seem to be inhibiting progress?

    • Please think about tactics or processes that have been less successful.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  7. How are you trying to overcome the challenges you’ve described? What approaches is the state using or considering?



  8. Please describe your state’s level of involvement in the development of the model EHR format.

    • Did the state have an opportunity to review and comment on the format?

  9. How satisfied are you with the level of involvement your state had in developing the model EHR format?

  10. What resources, including training, materials, or tools, has your state received to help you implement the model format?

    • Who provided these resources?

    • How satisfied are you with the resources provided?

  11. What additional resources, including training, materials, or tools, does your state need in order to implement the model format?

  12. What role will vendors play in implementing the model format?

  • What vendors are you working with?

    • Have you encountered any issues with vendors? If so, how are you addressing them?

  13. How did the project team go about recruiting and selecting providers to test the model EHR?

    • What factors were considered in selecting providers? Prior EHR experience? Their health IT sophistication? Past experience collecting and reporting core measures? Past experience implementing quality improvement initiatives?

    • If needed, probe on the number of targeted providers, the number actually recruited, and how quickly or easily they were recruited.

    • How similar or different are the participating providers from other providers in the state?

    • What was the nature of the state’s relationship with these providers before the CHIPRA demonstration began?

  14. What strategy will you use to motivate providers to implement and use the EHR?

    • Will you provide financial incentives such as one-time payments, performance bonuses, changes in reimbursement? How will they compare to Medicaid incentives related to adoption and meaningful use of EHRs?

    • Will you provide other incentives, such as free software, or technical assistance to implement the project?

  15. What resources, including training, materials, or tools, will you provide to providers to help them implement the new system? [Ask for copies of slides, materials, tools, or other resources.]

  16. Will providers receive resources, including training, materials, or tools, in addition to the resources provided through this demonstration? If so, please describe.

  • Who will provide these resources? How are they funded?

  17. Have you encountered any issues related to sufficient IT infrastructure for implementing the EHR? If so, how are you addressing them?

  18. Have you encountered any issues around interoperability, the ability to use the new EHR with existing systems? If so, how are you addressing them?

  19. Have you encountered any other challenges? If so, how are you addressing them?

  20. If possible, please quantify the start-up costs of the new system.

  21. Before you started implementing the CHIPRA demonstration, how would you have characterized the health IT capacity of the providers testing the model EHR?

    • IT infrastructure?

    • Experience implementing and/or using health IT?

  22. Before you started implementing the CHIPRA demonstration, what activities were underway with participating providers to improve their health IT capacity?

    • How successful were these initiatives? What factors made these initiatives more or less successful?

    • Are you integrating your category D project with these initiatives? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • Can you identify anything about these other initiatives that was critical to the conception, planning, or early implementation of your Category D project?

  23. How are the providers’ capacities in these areas changing?

    • To what extent would you say the CHIPRA quality demonstration is stimulating or driving these changes?

    • [Researcher will add probes if necessary to build on information provided in the state’s FOP.] How would you compare the CHIPRA quality demonstration activities to others, like HITECH dollars, plan incentive programs, or other efforts? (To the extent you can isolate the demonstration from other health IT initiatives.)







Other Quality Initiatives (Category E)


[NOTE: Use category E module if state has a category E project and the respondent has specialized knowledge.]


The next few questions will focus on Category E, other quality initiatives.


  1. [Researcher should briefly summarize goals mentioned previously.] What would you describe as the major goals in this area for the first year of the demonstration?

    • What goals or milestones have you most focused on achieving in this area?

  2. Please briefly describe the major strategies or approaches the state has been using this year to accomplish the goals in this area.

  3. How would you characterize this year’s progress toward those goals and milestones in this area?

    • Which, if any, have you reached or come close to reaching?

    • Which, if any, have not been reached on schedule or as planned?

    • How are you monitoring your progress?

  4. Thinking beyond this year for a moment, would you say the major strategies and outcomes in this logic model still represent your overall approach to this area of your demonstration in the long run? Are there any important differences? You’ve mentioned x and y strategy, but z from the FOP was not mentioned. Is that strategy coming later?

  5. In your experience so far, what strategies seem to be working well? What factors seem to be contributing to progress?

    • Please describe specific tactics or processes you have used that have worked well.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  6. What strategies seem to be working less well? What factors seem to be inhibiting progress?

    • Please think about tactics or processes that have been less successful.

    • Probe as needed on skills and experience of staff, effective planning, high-level support, other external factors.

  7. How are you trying to overcome the challenges you’ve described? What approaches is the state using or considering?

  8. How did you develop a plan for implementing the Category E project?

    • What role did different stakeholders play? How did you encourage collaboration among stakeholders, including providers and payers?



  9. How are you encouraging participation in the Category E project?

    • How are intended participants [stakeholders/providers/other states] responding to this method?

  10. What strategy are you using to motivate intended participants [providers/stakeholders/other states] to participate in the demonstration and make the desired changes?

    • How are intended participants responding to the incentives so far?

  11. What resources, including training, materials, or tools, are you making available to intended participants [providers/stakeholders/other states]? [Ask for copies of slides, materials, tools, or other resources.]

  • How are intended participants responding to the available resources?

  12. Are intended participants receiving resources, including training, materials, or tools, in addition to the resources provided through this demonstration? If so, please describe.


  • Who is providing these resources? How are they funded?

  • How are the intended participants responding to these resources?

  13. If possible, please quantify the start-up costs of the new initiative.

  • How do they compare to the costs of keeping the initiative going?

  14. Before you started implementing the CHIPRA demonstration in December 2010, were there other similar initiatives in the state? If so, please describe.

    • How successful were these initiatives? What factors made these initiatives more or less successful?

    • Are you integrating your category E project with these initiatives? If so, how?

      • Has the CHIPRA demonstration increased or decreased the amount of time, attention, and/or financial resources available to these initiatives? If so, please describe.

    • Can you identify anything about these other initiatives that was critical to the conception, planning, or early implementation of your Category E project?

    • How would you compare the impact of the CHIPRA quality demonstration to these initiatives?









Section Wrap-Up

  1. [If applicable for this state]: What benefits do you see from implementing multiple projects at once under this CHIPRA grant?

  • Have there been any unanticipated consequences so far, good or bad?

  • How will implementing one project help another project?

  • Conversely, have there been any unanticipated "costs" or negative impacts of implementing these multiple projects at once?

      • [Prompt]: For example, are there too many projects going on at once to do all of them well?

      • [Prompt]: Are problems in one area making it difficult to make progress in another?



IV. Indications of Change

I’d like to ask you about the [physician practices/health systems/other health care providers] that have been participating in the demonstration to date. To what extent have you noticed changes in:


  1. Provider motivation to participate? Which providers are more motivated? Less motivated? Please provide examples.


  2. Provider knowledge and skills? Which providers are improving their knowledge and skills? Which providers have not changed much in this area yet? Please provide examples.

  3. Attitudes and behaviors? To what extent have providers’ attitudes about things like CHIPRA core measures, EHR/HIE, or medical homes changed? How about everyday behaviors aimed at implementing changes in these areas? Please provide examples.


    • How are you monitoring changes in provider attitudes and behaviors?


How about Medicaid and CHIP MCOs in the state? What were your expectations about their involvement in the demonstration projects, and how has their role played out so far? To what extent have you noticed changes in:


  1. MCO motivation to participate? Which MCOs are more motivated? Less motivated? Please provide examples.


  2. Attitudes and behaviors? To what extent have MCOs’ attitudes about things like CHIPRA core measures, EHR/HIE, or medical homes changed? How about everyday behaviors aimed at implementing changes in these areas? Please provide examples.


    • How are you monitoring changes in MCO attitudes and behaviors?


V. Stakeholder Involvement

Now, I would like to talk about the involvement of other demonstration stakeholders, such as professional associations, state AAP chapters, and consumer advocacy groups that may play an advisory role on your projects.


  1. In what ways did you plan to involve these stakeholders in your demonstration projects and how has that played out so far?


  2. How satisfied have you been with the level of stakeholder involvement in the demonstration?


    • Has the level of involvement changed since the demonstration started?


    • Probe as needed on involvement in each category.


  3. How has stakeholder involvement facilitated or created challenges to implementing the demonstration?


  4. What changes, if any, do you plan to make to your strategy for involving stakeholders?


    • Involve additional stakeholders? Different stakeholders? Engage them in a different manner? Use a different method to encourage collaboration?


Now, I’d like to ask you to comment on the response of stakeholders to the quality demonstration to date. Specifically, to what extent have you noticed changes in:


  1. Stakeholder attitudes and behaviors? To what extent have stakeholders’ attitudes about things like CHIPRA core measures, EHR/HIE, or medical homes changed? How about everyday behaviors aimed at implementing changes in these areas? Please provide examples.


    • How are you monitoring changes in stakeholder attitudes and behaviors?

  2. Please describe any stakeholder resistance or skepticism you may have encountered or heard about to date.


  • What strategy or strategies have you used or considered to overcome the resistance or skepticism?



VI. Lessons Learned and Plans

  1. Given your experience so far in the demonstration, what lessons have you learned or what insights and advice might you have for other states?

  • Any overarching or specific lessons you’d like to share are fine.

  • Is there anything you would do differently if you could?

  2. What should other states be aware of in designing and implementing these kinds of quality initiatives?

  3. How do you plan to apply the lessons you’ve learned so far to future implementation efforts?

  4. What are the state’s major demonstration goals for the coming year?

  5. What factors do you see as most critical for your success over the next year?



VII. Wrap Up

  1. You have answered all my questions. Is there anything I didn’t ask that you’d like to tell me about?

Thank you very much for making time to speak with us.


