Protocol Changes


External Quality Review of Medicaid MCOs and Supporting Regulations in 42 CFR 438.360, 438.362, and 438.364 (CMS-R-305)


OMB: 0938-0786


Final Edits to EQR Draft Protocols based on Public Comments Received February 17–April 17, 2012


EDIT – External Quality Review Background – page 3 – added a recommendation that States have EQR Technical Reports available to CMS and the public by April of each year, to improve the accuracy of managed care data reported in the Secretary’s Annual Report on Quality each September.


The Children’s Health Insurance Program Reauthorization Act of 2009 (CHIPRA) requires that Children’s Health Insurance Program (CHIP) managed care plans also participate in external quality review (EQR). Section 403 of CHIPRA applies the managed care requirements of §§1932(a)(4), (a)(5), (b), (c), (d), and (e), which apply to Medicaid, to States contracting with MCOs for the delivery of care under separate CHIP programs. These provisions apply to contract years for managed care plans beginning on or after July 1, 2009. Section 401(c)(1) of CHIPRA requires each State to report annually on its child health quality measures and other State-specific information, including information collected through EQRs. CMS strongly encourages States to have final EQR Technical Reports available to CMS and the public by April of each year, for data collected within the prior 15 months. This submission timeframe will align with the collection and annual reporting of managed care data by the Secretary each September 30th, which is also required under the Affordable Care Act [Sec. 2701(d)(2)]. In 2010, the Secretary began analyzing and publishing information obtained from this annual data collection. In addition to its inclusion in these annual reports, EQR information will be part of the Secretary’s annual report to Congress on children’s healthcare quality issues.



EDIT – Protocol 1 – Compliance Review – pages 3-4 – added a recommendation that States have EQR Technical Reports available to CMS and the public by April of each year, to improve the accuracy of managed care data reported in the Secretary’s Annual Report on Quality each September.


Results of the MCO’s compliance review may be reported in the annual Secretary’s Report on the Quality of Care for Children in Medicaid and CHIP or the annual Secretary’s Report on the Quality of Care for Adults in Medicaid. These reports are released every September, and information that is not available from a State’s EQR report may be so noted in the reports. Both reports will be available on the CMS Medicaid website. CMS strongly encourages States to have final EQR Technical Reports available to CMS and the public by April of each year, for data collected within the prior 15 months. This submission timeframe aligns with the collection and annual reporting of managed care data by the Secretary each September 30th, as required under the Children’s Health Insurance Program Reauthorization Act (CHIPRA) [Sec. 401(c)(2)] and the Affordable Care Act [Sec. 2701(d)(2)].





EDIT – Protocol 1 – Compliance Review – page 8 – added a notation for EQRO discretion on interview attendees


Interview Participants

Interviews should be conducted with groups, rather than with single individuals, because rarely does one individual have sole responsibility for a particular function. Interview groups should include participants who represent different functions, services, or departments of the MCO to enable the reviewer to collect multiple perspectives about an issue. Group interviews are also an opportunity for MCO staff to learn about compliance activities in other departments. The EQRO has discretion to meet with fewer than the full list of recommended MCO employees when it believes it can obtain the required information without the attendance of all MCO employees listed in the Protocol, or when the MCO has identified a more appropriate person to address questions who is not on the recommended list. Attachment D includes compliance review questions for the following groups:

  • MCO leaders;

  • MCO information systems staff;

  • Quality assessment and performance improvement program staff;

  • Provider/contractor services staff;

  • Enrollee services staff, including grievance and appeal staff;

  • Utilization management staff;

  • Medical Director(s);

  • Case managers and care coordinators; and

  • MCO providers and contractors, as appropriate and as time and resources permit.



EDITS – Protocol 1 – Compliance Review – page 9 – clarified that EQRO review of a State’s HIT plan would be with respect to validation of performance measure or performance improvement project activities


Interviews & Systems

States have the opportunity to expand the roles of other State agencies with respect to their responsibilities for data exchanges, EHRs, interoperability, care coordination, and Medicaid or CHIP waivers. At its discretion, the State may determine:


  • Whether the EQRO will review the State’s health information technology (HIT) plan for HITECH and meaningful use with respect to validation of performance measures or performance improvement project activities; and

  • How the MCO’s systems will support State efforts in a valid way.


Also refer to Appendix V – Information Systems Capabilities Assessment.




EDIT - Protocol 1 – Compliance Review Appendix D, page 1: similar edit for EQRO discretion on interview attendees


MCO Leaders


The leadership interview is an opportunity to talk with the senior representatives of the MCO about their understanding and practice of the following MCO requirements. The following MCO leaders should be present during the interview(s); the EQRO has discretion, as needed, to rely on documented information (or other appropriate staff) when a recommended attendee’s participation is burdensome or difficult to schedule:




EDIT – Protocol 2 – Validation of Performance Measures – Purpose, page 3 – added a notation for States to consider including performance measure outcome and trending results as part of EQR Technical Reports.


Results of the MCO’s performance measures may be reported in the annual Secretary’s Report on the Quality of Care for Children in Medicaid and CHIP or the annual Secretary’s Report on the Quality of Care for Adults in Medicaid. These reports are released every September, and information that is not available from a State’s EQR report may be so noted in the reports. Both reports will be available on the CMS Medicaid website. States are strongly encouraged to have EQROs include performance measure outcome and trending information in the annual EQR technical report.



EDIT – Protocol 3 – Validation of Performance Improvement Projects – Purpose, page 3 – added a notation for States to consider including PIP outcome and trending results as part of EQR Technical Reports.


Results of the MCO’s PIPs may be reported in the annual Secretary’s Report on the Quality of Care for Children in Medicaid and CHIP or the annual Secretary’s Report on the Quality of Care for Adults in Medicaid. These reports are released every September, and information that is not available from a State’s EQR report may be so noted in the reports. Both reports will be available on the CMS Medicaid website. States are strongly encouraged to have EQROs include PIP outcome and trending information in the EQR technical report. This will enable the Secretary to include results and lessons learned from State intervention strategies to improve care as part of that annual reporting process.




EDIT – Protocol 3 – Validation of Performance Improvement Projects – Purpose, page 3 – added a notation for States to consider soliciting MCO input on PIP study topics and methodologies


Additionally, States may incorporate specific PIPs as part of their State quality strategy, required by Section 1932(c)(1) of the Social Security Act, to align with the HHS National Quality Strategy for Quality Improvement in Health Care. When doing so, soliciting input from participating MCOs/PIHPs in the identification of PIP topics and methodologies may be helpful, so that relevant clinical, administrative, and population-based improvement efforts are addressed as part of the State’s overall strategy to improve health care delivery and outcomes for the people it serves.



EDIT – Protocol 3 – Validation of Performance Improvement Projects – Step 2 – Potential Sources of Supporting Information – expanded the examples in the bullets on page 6 to include more enrollee input

Potential Sources of Supporting Information:

  • QI study documentation

  • Relevant clinical literature

  • Enrollee focus groups/surveys

  • Enrollee/provider representatives on Quality Committees



EDITS – Protocol 3 – Validation of Performance Improvement Projects – Section 4 has now been moved up to page 6 and has become Section 3; table of contents revised accordingly:

Step 3: Review the Identified Study Population


Measurement and improvement efforts must be system-wide.


Criteria

The PIP must clearly identify the ‘system’ or population, also referred to as the universe. Once the population is identified, the MCO will determine whether to study data for the entire population or a sample of that population. A representative sample of the identified population is acceptable (see Step 5).


Potential Sources of Supporting Information

Data on the MCO’s enrolled population as well as enrollee counts relevant to the study topic and measures. This includes:

  • Demographic information from the MCO’s enrollment files

  • The MCO’s utilization and outcome information such as:

      • Services

      • Procedures

      • Admitting and encounter diagnoses

      • Adverse incidents (e.g., deaths, avoidable admissions, readmissions)

      • Patterns of referrals

      • Authorization requests

  • Other databases, as needed (e.g., pharmacy claims data to identify patients taking specific medication(s) during a specific enrollment period).


Assessment

Review the study description and methodology to determine if the study clearly identified the study population. Consider the following questions:


  1. How was the “at risk” population defined?

  2. Are all individuals clearly defined in terms of the identified study question(s) and relevant indicators?

  3. Is the entire study population or a sample used? If the organization is able to collect and analyze data through an automated data system, it may be possible to study the whole population. If the data must be collected manually, sampling may be more realistic (an illustrative sample-size sketch follows this list).

  4. Did the definition of the study population include requirements for the length of study population members’ enrollment in the MCO? The required length of time will vary depending on the study topic and study indicators.

  5. If the entire population was studied, did the data collection approach capture all enrollees to whom the study question applied?

  6. If a sample was used, go to Step 5. If the entire population was studied, skip Step 5 and go to Step 6. If HEDIS® measures and sampling methodology are used, go to Step 7.
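
As a companion to question 3 above: where a sample rather than the entire population is used, one common check on whether the sample is large enough is the standard sample-size formula for estimating a proportion. This is an illustrative sketch only; the symbols are generic statistical notation, not terms defined in the protocol:

  n = \frac{z^2 \, p(1-p)}{e^2}

where p is the expected rate of the study indicator, e is the acceptable margin of error, and z is the critical value for the desired confidence level. For example, with p = 0.5 (the most conservative assumption), e = 0.05, and z = 1.96 (95 percent confidence), n ≈ 385 enrollees per measurement period.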

Step 4: Review the Selected Study Indicators



EDITS – Protocol 3 – Validation of Performance Improvement Projects – page 15 – Section 8 has been moved up to page 16 and has become Section 7; table of contents revised accordingly; and the second bullet under Assessment – C has been corrected to use the term “unambiguous”:


Step 7: Review Data Analysis and Interpretation of Study Results


In this step, the reviewer determines the accuracy of the MCO’s plan for analyzing and interpreting the PIP’s results. Accurate PIP data analysis is critical because the MCO will implement changes in treatment and operations based on the results of a PIP.


Criteria

The review examines the appropriateness of, and the adherence to, the statistical analysis techniques defined in the data analysis plan. Interpretation and analysis of the study data should be based on continuous improvement philosophies and reflect an understanding that most problems result from failures of administrative or delivery system processes. Interpreting the data should involve developing hypotheses about the causes of less-than-optimal performance and collecting data to validate the hypotheses.


Potential Sources of Supporting Information

  • Baseline project indicator measurements

  • Repeat project indicator measurements

  • Industry benchmarks

  • Analytic reports of PIP results by the MCO


Assessment

Examine the calculated plan performance on the selected measures. To review the data analysis and results of the study, consider the following:


  1. Is the analysis of the findings conducted in accordance with the data analysis plan?


  2. Are numerical results and findings presented in an accurate, clear, and easily understood manner?


  3. Does the analysis identify:


  • Initial and repeat measurements of project outcomes?

  • Realistic and unambiguous targets for the measures?

  • The statistical significance of any differences between the initial and repeat measurements (an illustrative test follows this list)?

  • Factors that influence the comparability of initial and repeat measurements?

  • Factors that threaten the internal or external validity of the findings?


  4. Does the analysis of the study data include an interpretation of the extent to which the PIP is successful and what follow-up activities are planned as a result?
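
One illustration of how the statistical significance of a difference between initial and repeat measurements might be assessed, assuming both are independent proportions (a sketch under generic assumptions, not a protocol requirement), is a two-proportion z-test comparing the baseline rate p_1 = x_1/n_1 with the remeasurement rate p_2 = x_2/n_2:

  z = \frac{p_2 - p_1}{\sqrt{\bar{p}\,(1-\bar{p})\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}}, \qquad \bar{p} = \frac{x_1 + x_2}{n_1 + n_2}

where x_1 and x_2 are the numerator counts and n_1 and n_2 the eligible populations or samples. A |z| greater than 1.96 suggests a difference unlikely to be due to chance at the 95 percent confidence level, provided the samples are large enough for the normal approximation to hold.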


Step 8: Assess the MCO’s Improvement Strategies



EDITS – Protocol 3 – Validation of Performance Improvement Projects – replaced the word “variable” with “indicator” for consistency in terminology, as defined in the Glossary, on multiple pages:

  • Table of contents, page 2

  • Activity 1, page 3

  • Step 4: Review the Selected Study Indicators, page 7

  • Assessment, page 8

  • Section E – Study design and quantitative data section – page 14



EDIT – Protocol 3 – Validation of Performance Improvement Projects – page 17 – removed duplicate question:

  1. Are there any documented improvements in processes or outcomes of care?


  2. Does the reported improvement in performance have “face” validity (i.e., on the face of it, does the intervention appear to have been successful in improving performance)?


  3. Does the improvement in performance appear to be the result of the planned quality improvement intervention?


  4. Is there any statistical evidence that any observed performance improvement is true improvement?



EDITS – Protocol 3 – Validation of Performance Improvement Projects – new Section 8 – added a reference to culturally and linguistically appropriate services


C. Are the interventions culturally and linguistically appropriate? For example, a mailing in English at a 12th-grade reading level to members of a predominantly Chinese-speaking population would not be appropriate. More information on culturally and linguistically appropriate services may be found at the following website: http://minorityhealth.hhs.gov/templates/browse.aspx?lvl=2&lvlID=15.



EDIT – Protocol 4 – Validation of Encounter Data – clarified that the 75 percent Federal match applies to any EQR Protocol activity


States may contract with EQROs for mandatory or voluntary activities at the 75 percent Federal match rate. While the validation of encounter data is voluntary, CMS strongly encourages States to contract with EQROs to implement this particular protocol at the 75 percent Federal match rate due to the need for overall valid and reliable encounter data as part of any State quality improvement efforts.



EDIT – Protocol 5 – Validation and Implementation of Surveys – added a concluding recommendation that States contract with EQROs to include results of HIT/EHR initiatives in annual EQR technical reports


In order to learn from and share State experiences with emerging HIT and EHR initiatives that can affect reporting of performance measure and performance improvement project outcomes, CMS strongly encourages States to contract with EQROs to include results of State HIT and EHR initiatives in annual EQR reports. This may include successful implementation of health information exchange with other State agencies to improve data source collection efforts for performance measures or performance improvement projects. Similarly, lessons learned from challenging or unsuccessful HIT initiatives are just as informative to Federal and other State partners, and may be a valuable source of information to include in the Annual Secretary’s Report on Quality published each September.



EDIT – Protocol 7 – Implementation of Performance Improvement Projects – added clarification to the introduction on the purpose of this voluntary protocol and options for working with EQROs


The purpose of this protocol is to provide guidance to EQROs conducting optional Performance Improvement Projects (PIPs) for the State. Federal regulations at 42 C.F.R. § 438.240(d) require MCOs to conduct a PIP, which must be validated by an EQRO using Protocol 3: Validating Performance Improvement Projects. States may also choose to have the EQRO conduct additional PIPs to assess and improve processes and outcomes of care provided by MCOs in the State. Study topics can align with Federal initiatives such as the Partnership for Patients or the Million Hearts Campaign. States also have the option to have the EQRO provide technical assistance on study or analytic methodologies to support MCO efforts in this area. It is also recommended that study questions consider the three aims of the National Quality Strategy:

  • Better care for patients and families,

  • Improved health for communities and populations, and

  • Affordable health care.




EDIT – Protocol 7 – Implementation of Performance Improvement Projects – revised the ordering of Activities 3 and 4, and Activities 7 and 8, to complement changes made to Protocol 3 on Validation of Performance Improvement Projects; table of contents revised accordingly.


ACTIVITY 3: USE A REPRESENTATIVE AND GENERALIZABLE STUDY SAMPLE


Measurement and improvement efforts must be system-wide. The PIP must clearly identify the ‘system’ or study population, also referred to as the universe. Once the population is identified, the MCO will determine whether to study data for the entire population or a sample of that population. A representative sample of the identified population is acceptable. See Protocol 3, Activity 1, Step 3 for information about how an EQRO validates an appropriate study population.



ACTIVITY 4: SELECT THE STUDY VARIABLES



ACTIVITY 7: ANALYZE DATA AND INTERPRET STUDY RESULTS


Data analysis begins with examining the performance on the selected clinical or non-clinical indicators. The examination should be initiated using the statistical analysis techniques defined in the data analysis plan. For detailed guidance, follow the criteria outlined in Protocol 3, Activity 1, Step 7.


ACTIVITY 8: IMPLEMENT INTERVENTION AND IMPROVEMENT STRATEGIES




