External Quality Review of Medicaid MCOs and Supporting Regulations in 42 CFR 438.360, 438.362, and 438.364 (CMS-R-305)
OMB: 0938-0786
EQR PROTOCOL 2 – Validation of Performance Measures Reported
by the MCO
Attachment A: Performance Measure Validation Worksheets
The data tables in this Attachment are designed to assist the EQRO in conducting Protocol 2 for
validation of performance measures reported by the MCO.

Worksheet 1: Performance Measures Collected by the MCO

For each sample measure, the worksheet records the method for calculating the performance measure (Administrative Data, Medical Record Review, Hybrid, EHR, or Survey) and the reporting frequency and format.

CHIPRA Child Initial Core Set Quality Measures (Sample Measures):
• Prenatal and Postpartum Care: Timeliness of Prenatal Care
• Frequency of Ongoing Prenatal Care
• Percentage of Live Births Weighing Less Than 2,500 Grams
• Cesarean Rate for Nulliparous Singleton Vertex
• Childhood Immunization Status
• Immunization for Adolescents
• Weight Assessment and Counseling for Nutrition and Physical Activity for Children/Adolescents: Body Mass Index Assessment for Children/Adolescents
• Developmental Screening in the First Three Years of Life
• Chlamydia Screening

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 0938-0786. The time required to complete this information collection is estimated to average 1,591 hours per response for all activities, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: CMS, 7500 Security Boulevard, Attn: PRA Reports Clearance Officer, Baltimore, Maryland 21244-1850.

• Well-Child Visits in the First 15 Months of Life
• Well-Child Visits in the 3rd, 4th, 5th, and 6th Years of Life
• Adolescent Well-Care Visit
• Percentage of Eligibles Who Received Preventive Dental Services
• Child and Adolescent Access to Primary Care Practitioners
• Appropriate Testing for Children with Pharyngitis
• Otitis Media with Effusion (OME) – Avoidance of Inappropriate Use of Systemic Antimicrobials in Children
• Percentage of Eligibles who Received Dental Treatment Services
• Ambulatory Care: Emergency Department Visits
• Pediatric Central-Line Associated Blood Stream Infections – Neonatal Intensive Care Unit and Pediatric Intensive Care Unit
• Annual Percentage of Asthma Patients 2 Through 20 Years Old with One or More Asthma-Related Emergency Room Visits
• Follow-Up Care for Children Prescribed Attention Deficit Hyperactivity Disorder (ADHD) Medication
• Annual Pediatric Hemoglobin A1C Testing
• Follow-up After Hospitalization for Mental Illness
• CAHPS® 4.0 (Child Version Including Medicaid and Children with Chronic Conditions Supplemental Items)

Adult Medicaid Initial Core Set Quality Measures (Sample Measures):
• Flu Shots for Adults Ages 50-64 (Collected as part of HEDIS CAHPS Supplemental Survey)
• Adult BMI Assessment
• Breast Cancer Screening
• Cervical Cancer Screening
• Medical Assistance With Smoking and Tobacco Use Cessation (Collected as part of HEDIS CAHPS Supplemental Survey)
• Screening for Clinical Depression and Follow-Up Plan
• Plan All-Cause Readmission
• PQI 01: Diabetes, Short-term Complications Admission Rate
• PQI 05: Chronic Obstructive Pulmonary Disease (COPD) Admission Rate
• PQI 08: Congestive Heart Failure Admission Rate
• PQI 15: Adult Asthma Admission Rate
• Chlamydia Screening in Women Age 21-24
• Follow-Up After Hospitalization for Mental Illness
• PC-01: Elective Delivery
• PC-03: Antenatal Steroids
• Controlling High Blood Pressure
• Comprehensive Diabetes Care: LDL-C Screening
• Annual HIV/AIDS Medical Visit
• Comprehensive Diabetes Care: Hemoglobin A1c Testing
• Antidepressant Medication Management
• Adherence to Antipsychotics for Individuals with Schizophrenia
• Annual Monitoring for Patients on Persistent Medications
• CAHPS Health Plan Survey v 4.0 Adult Questionnaire with CAHPS Health Plan Survey v 4.0H – NCQA Supplemental
• Care Transition – Transition Record Transmitted to Health Care Professional
• Initiation and Engagement of Alcohol and Other Drug Dependence Treatment
• Prenatal and Postpartum Care: Postpartum Care Rate


Worksheet 2: Performance Measure Validation Worksheet Template

PERFORMANCE MEASURE: {Insert name of performance measure}

For each audit element, indicate whether the element meets validation requirements (Yes, No, or N/A).

Validation Component: Documentation
• Appropriate and complete measurement plans and programming specifications exist that include data sources, programming logic, and computer source code.

Validation Component: Denominator
• Data sources used to calculate the denominator (e.g., eligibility files, claims files, provider files, pharmacy records) were complete and accurate.
• Calculation of the performance measure adhered to the specifications for all components of the denominator of the performance measure (e.g., member ID, age, sex, continuous enrollment calculation, clinical codes such as ICD-9 or ICD-10, CPT-4, DRGs, UB-92, member months calculation, member years calculation, and adherence to specified time parameters).

Validation Component: Numerator
• Data sources used to calculate the numerator (e.g., member ID, claims files, medical records, provider files, pharmacy records, including those for members who received the services outside the MCO's network) are complete and accurate.
• Calculation of the performance measure adhered to the specifications for all components of the numerator of the performance measure (e.g., clinical codes such as ICD-9 or ICD-10, CPT-4, LOINC, DRGs, pharmacy data, relevant time parameters such as admission/discharge dates or treatment start and stop dates, adherence to specified time parameters, number or type of provider).
• If medical record abstraction was used, documentation/tools were adequate.
• If the hybrid method was used, the integration of administrative and medical record data was adequate.
• If the hybrid method or solely medical record review was used, the results of the medical record review validation substantiate the reported numerator.

Validation Component: Sampling
• Sample was unbiased.
• Sample treated all measures independently.
• Sample size and replacement methodologies met specifications.

Validation Component: Reporting
• State specifications for reporting performance measures were followed.

Below is an example of a completed, customized performance measure validation worksheet
similar to what an EQRO would prepare prior to its onsite visit. This worksheet assumes that the
State has adopted the HEDIS® methodology for this performance measure. One of the following
scoring designations must be checked for each audit element:
MET: The MCO’s measurement and reporting process was fully compliant with State
specifications.
NOT MET: The MCO’s measurement and reporting process was not compliant with
State specifications. This designation should be used for any audit element that deviates
from the State specifications, regardless of the impact of the deviation on the final rate.
All audit elements with this designation must include an explanation of the deviation in the
comments section.
N/A: The audit element was not applicable to the MCO’s measurement and reporting
process.


PERFORMANCE MEASURE TO BE VALIDATED: CHLAMYDIA SCREENING IN WOMEN
METHODOLOGY FOR CALCULATING MEASURE (check one): [ ] Administrative   [ ] Medical Record Review   [ ] Hybrid

For each audit element, the auditor records the audit specifications, a designation of MET, NOT MET, or N/A, and comments.

DENOMINATOR

1. Population
   • Medicaid population appropriately segregated from commercial/Medicare.
   • Population defined as effective Medicaid enrollment as of Dec. 31 of the measurement year.

2. Geographic Area
   • Includes only those Medicaid enrollees served in the MCO's reporting area.

3. Age & Sex
   • Members aged 16-25 as of 12/31 of the measurement year.
   • Only females selected.

4. Enrollment Calculation
   • Were members of the MCO on 12/31 of the measurement year.
   • Were continuously enrolled from 1/1 to 12/31 of the measurement year with no more than one break of up to 45 days allowed.
   • Switches between populations (Medicaid, CHIP, and commercial) were not counted as breaks.

5. Event/Diagnosis
   • Sexually active based on pharmacy and claims/encounter data.
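
The denominator criteria above translate directly into a programmatic check. The following is a minimal Python sketch, not part of the protocol: the field names (`dob`, `sex`, `enrollment_spans`) and the measurement year are assumptions, while the 16-25 age band and the 45-day gap allowance are taken from the audit specifications above.

```python
from datetime import date

MEASUREMENT_YEAR = 2012                       # hypothetical; set by the State
YEAR_START = date(MEASUREMENT_YEAR, 1, 1)
YEAR_END = date(MEASUREMENT_YEAR, 12, 31)

def age_on(dob, as_of):
    """Age in whole years on a given date."""
    return as_of.year - dob.year - ((as_of.month, as_of.day) < (dob.month, dob.day))

def in_denominator(dob, sex, enrollment_spans):
    """Check audit elements 3 and 4 for one member: female, aged 16-25 as of
    12/31, a member on 12/31, and continuously enrolled 1/1-12/31 with no
    more than one gap of up to 45 days. Assumes non-overlapping (start, end)
    enrollment spans; field names are illustrative."""
    if sex.upper() != "F" or not 16 <= age_on(dob, YEAR_END) <= 25:
        return False
    if not any(start <= YEAR_END <= end for start, end in enrollment_spans):
        return False                                   # not a member on 12/31

    # Clip spans to the measurement year and look for coverage gaps.
    spans = sorted(
        (max(start, YEAR_START), min(end, YEAR_END))
        for start, end in enrollment_spans
        if end >= YEAR_START and start <= YEAR_END
    )
    gaps = []
    if spans[0][0] > YEAR_START:                       # uncovered days before the first span
        gaps.append((spans[0][0] - YEAR_START).days)
    for (_, prev_end), (next_start, _) in zip(spans, spans[1:]):
        if (next_start - prev_end).days > 1:           # uncovered days between spans
            gaps.append((next_start - prev_end).days - 1)
    return len(gaps) <= 1 and all(g <= 45 for g in gaps)
```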


6. Data Quality
   • Based on the information system assessment findings, are any of the data sources for this denominator inaccurate?

7. Proper Exclusion Methodology in Administrative Data (if no exclusions were taken, check N/A)
   • Only allowed exclusions were performed, according to current State specifications.
   • Only the codes listed in specifications as defined by the State were counted as exclusions.

8. Administrative Data: Counting Clinical Events
   • Standard codes listed in State specifications, or properly mapped internally developed codes, were used. (Intended to reference appropriate specifications as defined by the State.)
   • Members were counted only once; double counting was prevented.

9. Medical Record Review Documentation Standards
   • N/A

10. Time Period
   • Service performed between 1/1 and 12/31 of the measurement year.


NUMERATOR

11. Data Quality
   • Properly identified enrollees. Based on the information system assessment findings, were any of the data sources used for this numerator inaccurate?

SAMPLING
IF THE ADMINISTRATIVE METHOD WAS USED, CHECK "N/A" FOR AUDIT ELEMENTS 12, 13, AND 14.

12. Unbiased Sample
   • As specified in State specifications, a systematic sampling method was utilized.

13. Sample Size
   • After exclusions, the sample size is equal to 1) 411; 2) the appropriately reduced sample size, which used the current year's administrative rate or the preceding year's reported rate; or 3) the total population.

14. Proper Substitution Methodology in Medical Record Review (if no exclusions were taken, check N/A)
   • Only excluded members for whom medical record review revealed 1) contraindications that correspond to the codes listed in the appropriate specifications as defined by the State, or 2) data errors.
   • Substitutions were made for properly excluded records, and the percentage of substituted records was documented.
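
Elements 12 and 13 describe a systematic draw with a final sample of 411 (or the full population). The sketch below illustrates one way such a draw could be programmed; it is illustrative only, the `member_ids` frame and `seed` parameter are assumptions, and the State's specifications govern the actual sampling and sample-size-reduction rules.

```python
import random

def systematic_sample(member_ids, sample_size=411, seed=None):
    """Systematic sampling sketch: fix a documented sort order, pick a random
    starting point, then take every k-th record. If the eligible population
    is no larger than the requested size, the whole population is used
    (element 13, option 3). The fixed size of 411 comes from the worksheet
    text above; the State's specifications govern the real sample-size rules."""
    frame = sorted(member_ids)
    n = len(frame)
    if n <= sample_size:
        return frame
    rng = random.Random(seed)
    k = n / sample_size                      # sampling interval
    start = rng.uniform(0, k)                # random start within the first interval
    return [frame[int(start + i * k)] for i in range(sample_size)]
```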


Additional Questions

For each question, indicate YES or NO:
• Were members excluded for contraindications found in the administrative data?
• Were members excluded for contraindications found during the medical record review?
• Were internally developed codes used?

What range defines the impact of data incompleteness for this measure? (Check one.)
[ ] 0 - 5 percentage points
[ ] >5 - 10 percentage points
[ ] >10 - 20 percentage points
[ ] >20 - 40 percentage points
[ ] >40 percentage points
[ ] Unable to Determine

What is the direction of the bias? (Check one.)
[ ] OVER-REPORTING
[ ] UNDER-REPORTING

Upon what documentation is the above percentage based (e.g., internal reports, studies, comparison to medical records, etc.)?
Validation Finding
The validation finding for each measure is determined by the magnitude of the errors detected
for the audit elements, not by the number of audit elements determined to be “NOT MET”.
Consequently, it is possible that an error for a single audit element may result in a designation
of “NR” because the impact of the error biased the reported performance measure by more than
“x” percentage points. Conversely, it is also possible that several audit element errors may
have little impact on the reported rate and, thus, the measure could be given a designation of
“R.” The following is a list of the validation findings and their corresponding definitions:
R

=

Report
Measure was compliant with State specifications.

NR

=

Not Reported
This designation is assigned to measures for which: 1) the MCO rate was materially
biased, or 2) the MCO was not required to report.

NB

=

No Benefit
Measure was not reported because the MCO did not offer the benefit required by
the measure.


AUDIT DESIGNATION


Worksheet 3: Potential Documents and Processes for Review
In order to assess the MCO's information system and the validity of reported performance
measures, the EQRO will need to review a number of data sources and processes. The EQRO
should ask the MCO to make available the following documents, data, and procedures for
observation; the EQRO will use its discretion in selecting which ones to review.
Integration and Control of Data
• Procedures and standards for all aspects of the data repository(ies) used in the
production of performance measures, including building, maintaining, managing, testing,
and production of performance measures.
• Manuals covering application system development methodology, database
development, and design and decision support system utilization.
• Control system documentation including flow charts and codes for backups, recovery,
archiving, and other control functions.
• Procedures to consolidate information from disparate transaction files.
• Record and file formats and descriptions, for entry, intermediate, and repository files.
• Electronic formats and protocols.
• Electronic transmission procedures documentation.
• Processes to extract information from the repository(ies).
• Source code for data entry, data transfer, and data manipulation programs and processes.
• Descriptive documentation for data entry, transfer, and manipulation programs and
processes.
• If applicable, procedures for coordinating vendor activities to safeguard the integrity of
the performance measurement data.
• Samples of data from repository and transaction files to assess accuracy and
completeness of the transfer process.
• Comparison of actual results from file consolidation and data abstracts to those which
should have resulted according to documented algorithms.
• Documentation of data flow among vendors to assess the extent to which there has
been proper implementation of procedures to safeguard the integrity of the performance
measure data.
• Documentation of data cutoff dates.
• Documentation of proper run controls and of staff review of report runs.
• Copies of files and databases used for performance measure calculation and reporting.
• Procedures governing production process for MCO performance measures, including
standards and schedules.
Collection, Calculation, and Documentation of Performance Measurements
• Policies for the documentation of data requirements, issues, validation efforts, and results.
• A project or measurement plan for each performance measure.
• Documentation of programming specifications, including work flow, data sources, and uses, which include diagrammatic or narrative descriptions.


• Documentation of the original universe of data that includes record-level patient identifiers that can be used to validate the entire programming logic for creating denominators, numerators, and samples.
• Documentation of computer queries, programming logic, or source code used to create final denominators, numerators, and interim data files.
• Documentation that includes a dated job log or computer run for denominators and numerators, with record counts for each programming step and iteration.
• Documentation of medical record review including: qualifications of the medical record review supervisor and staff; reviewer training materials; audit tools used, including completed copies of each record-level reviewer determination; all case-level critical performance measure data elements used to determine a positive or negative event or exclude a case from same; and inter-rater reliability testing procedures and results.
• Documentation of results of statistical tests and any corrections or adjustments to data, along with justification for such changes.
• Documentation of sources of any supporting external data or prior years' data used in reporting.
• Policies to assign a unique membership ID that allows all services to be properly related to the specific appropriate enrollee, despite changes in status, periods of enrollment or disenrollment, or changes across product lines (e.g., CHIP and Medicaid).
• Procedures to identify, track, and link member enrollment by product line, product, geographic area, age, sex, member months, and member years.
• Procedures to track individual members through enrollment, disenrollment, and possible re-enrollment.
• Procedures to track members through changes in family status, changes in benefits or managed care type (if they switch between Medicaid coverage and another product within the same MCO).
• Methods to define start and cessation of coverage.
• Procedures to link member months to member age.
• Description of software or programming languages used to query each database.
• Description of software used to execute the sampling sort of population files when sampling (systematic) is used.
• Member database.
• Provider data (including facilities, labs, pharmacies, physicians, etc.).
• Database record layout and data dictionary.
• Survey data used for performance measures (see Protocol 5).
• Policies to maintain files from which the samples are drawn in order to keep the population intact in the event that a sample must be re-drawn or replacements made.
• Computer source code or logic identifying specified sampling techniques, and documentation that the logic matches the specifications set forth for each performance measure, including sample size and exclusion methodology.
• Methods used for sampling for measures calling for medical record or hybrid data.
• Documentation assuring that sampling methodology treats all measures independently and that there is no correlation between drawn samples.
• Observation or documentation of procedures in which a biased sample was identified and corrected.


• Documentation of "frozen" or archived files from which the samples were drawn, and if applicable, documentation of the MCO's process to re-draw a sample or obtain necessary replacements.
• For performance measures which are easily under-reported, procedures to capture data that may reside outside the MCO's data sets.
• Procedures for mapping non-standard codes to standard coding (an illustrative sketch follows this list).
• Policies, procedures, and materials that evidence proper training, supervision, and adequate tools for medical record abstraction tasks. (May include medical record abstraction tools, training material, checks of inter-rater reliability, etc.)
• Procedures for assuring that combinations of record-review data with administratively determined data are consistent and verifiable.
• Evidence that the MCO's use of codes to identify medical events was correctly evaluated when classifying members for inclusion or exclusion in the numerator.
• Evidence that the MCO has counted each member and/or event only once.
• Programming logic or demonstration that confirms that any non-standard codes used in determining the numerator have been mapped to a standard coding scheme in a manner that is consistent, complete, and reproducible.
• Programming logic or source code that identifies the process for integrating administrative and medical record data for the numerator.
• Procedures for properly executing complex medical algorithms, such as claim-dependent events; events that require matching claims and pharmacy data; events that require matching visit codes; and events that require accurately identifying and computing multiple numerator events.
• Procedures for displaying denominator counts, numerator counts, precision levels, sums, and cross-totals.
• Procedures for reporting small sample sizes (to be consistent with the required methodology established by the State).
• Programming logic and/or source code for arithmetic calculation of each measure.
• Review of reported measures to assess consistency of common elements (e.g., membership counts, number of pregnancies and births, etc.).
• Programming logic and/or source code for measures with complex algorithms, to ensure adequate matching and linkage among different types of data.
• Documentation showing confidence intervals of calculations when a sampling methodology is used.
• Documentation showing calculation of levels of significance of changes.
• Procedures for submitting reports that meet State requirements (e.g., specified electronic format, supporting documentation, and timing).
• Documentation that procedures for properly submitting required reports to the State were implemented appropriately.
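
As a rough illustration of the code-mapping item above, the sketch below applies a deterministic crosswalk and flags unmapped codes so completeness can be verified; the internal code names and the standard codes shown are assumptions, not values taken from the protocol.

```python
# Hypothetical crosswalk from MCO-internal (non-standard) codes to the standard
# codes named in the State specifications; in practice this table would come
# from the MCO's documented mapping, not from hard-coded values.
INTERNAL_TO_STANDARD = {
    "LAB-CHLAM-01": "87491",   # illustrative internal code -> assumed standard CPT code
    "LAB-CHLAM-02": "87810",
}

def map_to_standard(codes):
    """Apply the crosswalk deterministically (consistent and reproducible) and
    surface anything unmapped instead of silently dropping it, so the mapping's
    completeness can be checked."""
    mapped, unmapped = [], []
    for code in codes:
        if code in INTERNAL_TO_STANDARD:
            mapped.append(INTERNAL_TO_STANDARD[code])
        else:
            unmapped.append(code)
    if unmapped:
        raise ValueError(f"unmapped non-standard codes: {sorted(set(unmapped))}")
    return mapped
```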


Worksheet 4: Data Integration and Control – Documentation Review Worksheet

For each item, the reviewer records Reviewed, Not Reviewed, and Comments.

Documentation:
• Procedures and standards for all aspects of the data repository(ies), including building, maintaining, managing, testing, and production of performance measures
• Manuals covering application system development methodology, database development and design, and decision support system utilization
• Control system documentation including flow charts and codes for backups, recovery, archiving, and other control functions
• Procedures to consolidate information from disparate transaction files to support performance measurement
• Record and file formats and descriptions, for entry, intermediate, and repository files
• Electronic formats and protocols
• Electronic transmission procedures documentation
• Processes to extract information from the repository to produce the intended result
• Source code for data entry, data transfer, and data manipulation programs and processes
• Descriptive documentation for data entry, data transfer, and data manipulation programs and processes
• If applicable, procedures for coordinating activities of multiple subcontractors in a way that safeguards the integrity of the performance measure data
• Samples of data from repository and transaction files to assess accuracy and completeness of the transfer process
• Comparison of actual results from file consolidation and data abstracts to those which should have resulted according to documented algorithms
• Documentation of data flow among vendors to assess the extent to which there has been proper implementation of procedures for coordinating activities to safeguard the integrity of the performance measure data
• Documentation of data cutoff dates
• Documentation of proper run controls and of staff review of report runs
• Copies of files and databases used for performance measure calculation and reporting
• Procedures governing the production process of plan-level performance measures, including standards and schedules
In the comments section, be sure to address the following:
• Compare samples of data in the repository to transaction files. Are any members, providers, or services lost in the process?
• Is the required level of coding detail maintained (e.g., all significant digits, primary and secondary diagnoses remain)?
• If the MCO uses a performance measure repository, review the repository structure. Does it contain all the key information necessary for performance measure reporting?
• How does the MCO test the process used to create the performance measure reports?
• Does the MCO use any algorithms to check the reasonableness of data integrated to report the MCO-level performance measures?
• Examine report production logs and run controls. Is there adequate documentation of the performance measure report generation process? How are report generation programs documented? Is there version control in place?
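
The first comment prompt above (whether members, providers, or services are lost between the transaction files and the repository) is essentially a set-difference check. Below is a minimal sketch under the assumption that each extract can be reduced to a set of record keys; the `claim_id` field name and the CSV layout are hypothetical.

```python
import csv

def load_keys(path, key_field):
    """Read one column from a delimited extract into a set of keys."""
    with open(path, newline="") as f:
        return {row[key_field] for row in csv.DictReader(f)}

def reconcile(transaction_path, repository_path, key_field="claim_id"):
    """Compare transaction-file keys against the repository extract and report
    anything lost or unexpectedly added during the transfer."""
    source = load_keys(transaction_path, key_field)
    target = load_keys(repository_path, key_field)
    return {
        "missing_from_repository": sorted(source - target),
        "not_in_transactions": sorted(target - source),
        "match_rate": len(source & target) / len(source) if source else 1.0,
    }
```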


Worksheet 5: Interview Guide for Data Integration and Control MCO Personnel
Background Information
Name of MCO:
Date:
Location:
Year of First Medicaid Enrollment:
Year of First CHIP Enrollment:
Year of First MCO Performance Report (any product line):
EQRO Reviewers:

Names and Titles of Individuals Interviewed:
Has the MCO previously undergone validation of its State performance measure reporting
process? If so, when did the validation take place and who conducted it?
Other general issues:
Interview Questions
1. How is performance measure data collection accomplished:
   • By querying the applicable information system on-line?
   • By using extract files created for analytical purposes? If so, how frequently are the files updated? How is claim/encounter data checked for accuracy?
   • By using a separate relational database or data warehouse? If so, is this the same system from which all other reporting is produced? Are reports created from an NCQA-certified vendor software product? If so, how frequently are the files updated? How are reports checked for accuracy?
2. Review the procedure(s) for consolidating claims/encounter, member, provider, and other data necessary for performance reporting (whether it be into a relational database or file extracts on a measure-by-measure basis).
   • How many different sources of data are merged together to create reports?
   • What control processes are in place to ensure that this merger is accurate and complete?


3. How does the MCO test the process used to create the performance measure reports?
4. Does the MCO use any algorithms to check the reasonableness of data integrated to report the MCO performance measures?
5. Are performance measurement reporting programs reviewed by supervisory staff?
6. Is there an internal backup for performance measure programmers - do others know the programming language and the structure of the actual programs? Is there documentation?
7. How does the MCO prevent loss of claim and encounter data when systems fail?
8. What administrative data backup systems are in place?
9. What types of authorization are required to be able to access claims/encounter, provider, membership, and performance measure repository data?
10. Describe documentation review and demonstrations provided:


Worksheet 6: Data Integration and Control Findings Worksheet

For each element, the reviewer records Met, Not Met, or N/A, plus comments.

Accuracy of data transfers to assigned performance measure repository
• MCO processes accurately and completely transfer data from the transaction files (e.g., membership, provider, encounter/claims) into the repository used to keep the data until the calculations of the performance measures have been completed and validated
• Samples of data from the repository are complete and accurate

Accuracy of file consolidations, extracts, and derivations
• MCO's processes to consolidate diversified files and to extract required information from the performance measure repository are appropriate
• Actual results of file consolidations or extracts were consistent with those which should have resulted according to documented algorithms or specifications
• Procedures for coordinating the activities of vendors ensure the accurate, timely, and complete integration of data into the performance measure database
• Computer program reports or documentation reflect vendor coordination activities, and no data necessary to performance measure reporting are lost or inappropriately modified during transfer

If the MCO uses one, the structure and format of the performance measure data repository facilitates any required programming necessary to calculate and report required performance measures.
• The repository's design, program flow charts, and source codes enable analyses and reports
• Proper linkage mechanisms have been employed to join data from all necessary sources (e.g., identifying a member with a given disease/condition)

Assurance of effective management of report production and of the reporting software.
• Documentation governing the production process, including MCO production activity logs and MCO staff review of report runs, was adequate
• Prescribed data cutoff dates were followed
• The MCO has retained copies of files or databases used for performance measure reporting, in the event that results need to be reproduced
• The reporting software program is properly documented with respect to every aspect of the performance measurement reporting repository, including building, maintaining, managing, testing, and report production
• MCO's processes and documentation comply with the MCO standards associated with reporting program specifications, code review, and testing


Worksheet 7: Data and Processes Used to Produce Performance Measures - Documentation Review Checklist

For each item, the reviewer records Reviewed, Not Reviewed, and Comments.

Documentation:
• Policies which stipulate and enforce documentation of data requirements, issues, validation efforts, and results
• Procedures for displaying denominator counts, numerator counts, precision levels, sums, and cross-totals
• Procedures for reporting small sample sizes (consistent with the State's required methodology)
• All reported measures, to assess consistency of common elements (e.g., membership counts, number of pregnancies and births, etc.)

For each measure:
• Programming logic and/or source code for arithmetic calculation
• A project or measurement plan, including work flow
• Documentation of programming specifications and data sources
• Documentation of the original universe of data, including record-level patient identifiers that can be used to validate the entire programming logic for creating denominators, numerators, and samples
• Documentation of computer queries, programming logic, or source code used to create denominators, numerators, and interim data files
• Documentation of medical record review for each measure, as appropriate, including: qualifications of the medical record review supervisor and staff; reviewer training materials; audit tools used (including completed copies of each record-level reviewer determination); all case-level critical performance measure data elements used to determine a positive or negative event or exclude a case from same; and inter-rater reliability testing procedures and results
• Documentation of results of statistical tests and any corrections or adjustments to data, along with justification for such changes, for each measure, as appropriate
• Documentation showing calculation of levels of significance of changes for each measure
• Documentation (for each performance measure, as appropriate) showing confidence intervals of calculations when a sampling methodology is used
• Documentation of sources of any supporting external data or prior years' data used in reporting (for each performance measure, as appropriate)
Describe Documentation Reviewed and Demonstrations Provided:
Questions:
1. How are policies governing documentation of data requirements for performance measurement (e.g., data file and field definitions, mapping between standard and non-standard codes) updated and enforced? Who is responsible for this?
2. How are programming specifications for MCO performance measures documented? Who is responsible for this?
3. Are the documentation processes up to date?


Worksheet 8: Data and Processes Used to Produce Performance Measures - Findings Worksheet

For each audit element, the reviewer records Met, Not Met, or N/A, plus comments.

Measurement plans and policies which stipulate and enforce documentation of data requirements, issues, validation efforts, and results. These include:
• Data file and field definitions used for each measure
• Maps to standard coding if not used in original data collection
• Statistical testing of results and any corrections or adjustments made after processing

Documentation of programming specifications (which may be either a schematic diagram or in narrative form) for each measure includes at least the following:
• All data sources, including external data (whether from a vendor, public registry, or other outside source), and any prior years' data (if applicable)


• Detailed medical record review methods and practices, including the qualifications of the medical record review supervisor and staff, reviewer training materials, audit tools used (including completed copies of each record-level reviewer determination), all case-level critical performance measure data elements used to determine a positive or negative event or exclude a case from same, and inter-rater reliability testing procedures and results
• Detailed computer queries, programming logic, or source code used to identify the population or sample for the denominator and/or numerator
• If sampling is used, a description of sampling techniques and documentation that assures the reviewer that samples used for baseline and repeat measurements of the performance measures were chosen using the same sampling frame and methodology
• Documentation of calculation for changes in performance from previous periods (if applicable), including statistical tests of significance


• Data that are related from measure to measure are consistent (e.g., membership counts, provider totals, number of pregnancies and births)
• Appropriate statistical functions are used to determine confidence intervals when sampling is used in the measure (an illustrative sketch follows)
• When determining improvement in performance between measurement periods, appropriate statistical methodology is applied to determine levels of significance of changes
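
For the confidence-interval element above, one common (but not protocol-mandated) choice is the normal-approximation interval for a proportion, sketched below; the function name and example numbers are hypothetical.

```python
import math

def proportion_ci(numerator_hits, sample_size, z=1.96):
    """Two-sided 95% confidence interval for a sampled rate using the normal
    approximation; the protocol does not prescribe a formula, so this is only
    one common choice."""
    p = numerator_hits / sample_size
    half_width = z * math.sqrt(p * (1 - p) / sample_size)
    return max(0.0, p - half_width), min(1.0, p + half_width)

# e.g., 287 numerator hits found in a sample of 411 members
low, high = proportion_ci(287, 411)
```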


Worksheet 9: Policies, Procedures, Data, and Information Used to Produce Measures: Review Checklist

Policies, Procedures, Data, and Information to be reviewed (for each, the reviewer records Reviewed, Not Reviewed, and Comments):
• Policies to assign a unique membership ID that allows all services to be properly related to the specific appropriate enrollee, despite changes in status, periods of enrollment or disenrollment, or changes across product lines (e.g., Medicare and Medicaid)
• Procedures to identify, track, and link member enrollment by product line, product, geographic area, age, gender, member months, and member years
• Procedures to track individual members through enrollment, disenrollment, and possible re-enrollment
• Procedures to track members through changes in family status, changes in employment or benefits or managed care type (if they switch between Medicaid coverage and another product within the same MCO)
• Methods to define start and cessation of coverage
• Procedures to link member months to member age
• Description of software or programming languages used to query each database
• Programming logic and/or source code for arithmetic calculation of each measure
• Programming logic and/or source code for measures with complex algorithms, to ensure adequate matching and linkage among different types of data
• Member database
• Provider data (including facilities, labs, pharmacies, physicians, etc.)
• Database record layout and data dictionary

• Survey data
• For performance measures which are easily under-reported, procedures to capture data that may reside outside the MCO's data sets
• Procedures for mapping non-standard codes to standard coding to ensure consistency, completeness, and reproducibility
• Policies, procedures, and materials that evidence proper training, supervision, and adequate tools for medical record abstraction tasks (may include medical record abstraction tools, training material, checks of inter-rater reliability, etc.)
• Procedures for assuring that combinations of record-review data with administratively determined data are consistent and verifiable
• Evidence that the MCO's use of codes to identify medical events was correctly evaluated when classifying members for inclusion or exclusion in the numerator
• Evidence that the MCO has counted each member and/or event only once
• Programming logic or demonstration that confirms that any non-standard codes used in determining the numerator have been mapped to a standard coding scheme in a manner that is consistent, complete, and reproducible
• Programming logic or source code that identifies the process for integrating administrative and medical record data for the numerator
• Programming logic and/or source code for arithmetic calculation of each measure
• Programming logic and/or source code for measures with complex algorithms, to ensure adequate matching and linkage among different types of data


Describe documentation review and any demonstrations provided.


Worksheet 10: Interview Guide for Assessing Processes Used to Produce
Denominators and Numerators
1. If any part of your network/data/membership was excluded from a performance
measure, how and why did you decide to exclude it?
2. Why did you select the reporting methodology (e.g., administrative or hybrid) used to
create each of the measures (where there was an option)?
3. Did you use the State technical specifications as the specifications for the programmers,
or did your MCO write its own instructions/translations for the programmers?
4. Are there any manual processes used for calculating denominators and/or numerators?
Are manual processes used for sampling?
5. Are any measures calculated by vendors? If yes, are they checked for accuracy?
Please describe.
6. Do you have any concerns about the integrity of the information used to create any of
the measures? Please describe.
7. Do you know of any deviations from performance measure specifications that were
necessary because of the data available or because of your MCO's information system
capabilities?
8. Other issues.
9. Names and titles of persons interviewed:


Worksheet 11: Measure Validation Findings Worksheet
For each of the performance measures, all members of the relevant populations identified in
the performance measure specifications are included in the population from which the
denominator is produced.
For each audit element below, the reviewer records Met, Not Met, or N/A, plus comments.

All members who were eligible
to receive the specified services
were included in the initial
population from which the final
denominator was produced.
This “at risk” population
included both members who
received the services, as well
as those who did not. This
same standard applies to
provider groups or other
relevant populations identified in
the specifications of each
performance measure.
Adequate programming logic or source code exists to appropriately identify all “relevant”
members of the specified denominator population for each of the performance measures.

For each measure,
programming logic or source
code which identifies, tracks,
and links member enrollment
within and across product lines
(e.g., Medicare and Medicaid),
by age and sex, as well as
through possible periods of
enrollment and disenrollment,
has been appropriately applied
according to the specifications
of each performance measure.
Calculations of continuous
enrollment criteria were
correctly carried out and applied
to each measure (if applicable).
Proper mathematical operations
were used to determine patient
age or range.



The MCO can identify the
variable(s) that define the
member’s sex in every file or
algorithm needed to calculate
the performance measure
denominator, and the MCO can
explain what classification is
carried out if neither of the
required codes is present.
The MCO has correctly
calculated member months and
member years, if applicable to
the performance measure.
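
As an illustration of the member-month element above, the sketch below counts member months for a single member; the anchor-day convention, the field layout, and the example year are assumptions rather than the State's actual rule.

```python
from datetime import date

def member_months(enrollment_spans, year, anchor_day=15):
    """Member-month sketch for one member: a month counts when the member is
    enrolled on an anchor day of that month. The anchor-day convention is an
    assumption; the State's specifications define the actual counting rule.
    Member years are member months divided by 12."""
    count = 0
    for month in range(1, 13):
        anchor = date(year, month, anchor_day)
        if any(start <= anchor <= end for start, end in enrollment_spans):
            count += 1
    return count
```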
Completeness and accuracy of the codes used to identify medical events have been assessed, and the codes have been appropriately applied.

The MCO has properly
evaluated the completeness
and accuracy of any codes
used to identify medical events,
such as diagnoses, procedures,
or prescriptions, and these
codes have been appropriately
identified and applied as
specified in each performance
measure.
Specified time parameters are followed.

Any time parameters required
by the specifications of the
performance measure are
followed (e.g., cut off dates for
data collection, counting 30
calendar days after discharge
from a hospital, etc.).


Exclusion criteria included in the performance measure specifications have been followed.


Performance measure
specifications or definitions that
exclude members from a
denominator were followed. For
example, if a measure relates to
receipt of a specific service, the
denominator may need to be
adjusted to reflect instances in
which the patient refuses the
service or the service is
contraindicated.
Systems to estimate populations that cannot be accurately counted exist and are utilized when appropriate.

Systems or methods used by
the MCO to estimate
populations when they cannot
be accurately or completely
counted (e.g., newborns) are
valid.
All appropriate data are used to identify the entire at-risk population.

The MCO has used the
appropriate data, including
linked data from separate data
sets, to identify the entire at-risk
population.
The MCO has in place and
utilizes procedures to capture
data for those performance
indicators that could be easily
under-reported due to the
availability of services outside
the MCO.


Qualifying medical events (such as diagnoses, procedures, prescriptions, etc.) are properly identified and confirmed for inclusion in terms of time and services.

The codes the MCO uses to identify
medical events are complete,
accurate, and specific in correctly
describing what has transpired
and when.
The MCO correctly evaluated
medical event codes when
classifying members for
inclusion or exclusion in the
numerator.
The MCO has avoided or
eliminated all double-counted
members or numerator events.
Any non-standard codes used
in determining the numerator
have been mapped to a
standard coding scheme in a
manner that is consistent,
complete, and reproducible as
evidenced by a review of the
programming logic or a
demonstration of the program.
Any time parameters required
by the specifications of the
performance measure are
adhered to (i.e., that the
measured event occurred
during the time period specified
or defined in the performance
measure).
Medical record data extracted for inclusion in the numerator are properly collected.

Medical record reviews and
abstractions have been carried
out in a manner that facilitates
the collection of complete,
accurate, and valid data.
Record review staff have been
properly trained and supervised
for the task.


Record abstraction tools
require the appropriate notation
that the measured event
occurred.
Record abstraction tools
require notation of the results or
findings of the measured event
(if applicable).
Data included in the record
extract files are consistent with
data found in the medical
records, as evidenced by a
review of a sample of medical
records for applicable
performance measures. (From
Medical Record Review
Validation Tools - Table 5,
ATTACHMENT XII)
The process of integrating
administrative data and medical
record data for the purpose of
determining the numerator is
consistent and valid.


Worksheet 12: Medical Record Review Validation Tools
The purpose of medical record review (MRR) validation is to verify the accuracy of the MRR conducted
by each MCO. For each of at least two measures which included medical record review, the EQRO will
validate the medical records of 30 enrollees found to meet numerator requirements. In States with CHIP
programs that are separate from the Medicaid program and have their own EQR, the EQRO should
review 30 enrollees for Medicaid and 30 enrollees for CHIP. Only those members included in a hybrid
sample will be selected - the EQRO will not conduct medical record audits to validate administrative data.
For each measure in which medical record review was used, the EQRO will request a list of all of the
members in the MCO’s MRR sample. From that list, the EQRO will identify a sample of 30 members
who meet numerator requirements. MCOs will then be asked to provide access to or copies of medical
records so that the EQRO can verify that each member was appropriately included in the denominator
and received the required numerator service(s). In cases where there are fewer than 30 numerator
positives, the EQRO will review all records for that measure.
To provide sufficient time for each MCO to gather the required medical record documentation, the EQRO
may direct the MCOs to submit their lists of members in their hybrid sample twice - the first list as a
preliminary submission and the second list as a final submission. Submitting a first list prior to
completion of the MRR process would allow an MCO additional time to retrieve medical record
documentation. Soon after receipt of the first list, the EQRO will provide the MCO with the list of medical
records for which documentation must be submitted. Only a portion of the 30 medical records for the
validation sample will be included in the EQRO’s first sample request list. The remainder of the 30
records will be selected from the final list. While the first submission of MRR findings is optional, it is
recommended.
The EQRO would accept the first list submission approximately one month prior to the scheduled audit or
such other time as the EQRO shall specify. If an MCO chooses to submit a first list of medical records, it
must still submit a final listing sufficiently in advance of the scheduled audit as directed by the EQRO.
For each submission, MCOs will need to identify all members for whom MRR has been conducted and
indicate which members have been found to be numerator positives through MRR. The final list must
reflect the MCO’s final medical record review findings, with members for whom a medical record was
never found identified as not having met the numerator requirements.
No predetermined “passing” grade will be set for the medical record audit. Rather, onsite auditors will
use the MRR results to determine if the hybrid rate or solely MRR rate, as a whole, is biased, and to what
extent that bias affects the final reported rate for that measure. The EQRO will identify to the State what
effects bias, as well as incomplete data, will have on the MCO’s calculation of the performance measure.
For each of the evaluated measures auditors will determine the impact of the findings from the MRR
validation process on the MCO’s Final Audit Designation.
Step 1: Calculation of the Medical Record Review Error Rate
The EQRO will review up to 30 records identified by the MCO as meeting numerator requirements (as
determined through MRR) for the measures audited. Records are randomly selected from the entire
population of MRR numerator positives identified by the plan, as indicated on the MRR numerator listings
submitted to the EQRO. If fewer than 30 medical records are found to meet numerator requirements, all
records are reviewed. Administrative numerator positives are not included as part of this validation
process. The EQRO will calculate a MRR error rate for each performance measure calculated by the
hybrid method or solely from MRR as illustrated in Table 1, below:
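
The selection of records for reabstraction described in Step 1 amounts to a simple random draw, as in the hedged sketch below; the function and argument names are hypothetical.

```python
import random

def select_validation_sample(mrr_numerator_positive_ids, n=30, seed=None):
    """Randomly choose up to 30 MRR numerator positives for reabstraction;
    if fewer than 30 exist, all of them are reviewed, as described above."""
    ids = list(mrr_numerator_positive_ids)
    if len(ids) <= n:
        return ids
    return random.Random(seed).sample(ids, n)
```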


TABLE 1: Summary of Medical Record Review (MRR) Reabstraction Findings

Column A: Performance Measure | Column B: Number of MRR Positives Selected for Audit | Column C: Number of Medical Records Received | Column D: Number of Medical Records Found to be Compliant | Column E: Accuracy Rate (%) (D/B) | Column F: Error Rate (%) (100% - E)

Column A: Name of performance measure evaluated.
Column B: Total number of MRR numerator positive records reabstracted by the EQRO as part of the medical record review validation process (i.e., 30, or the total population if fewer than 30 MRR numerator positives were reported).
Column C: Total number of medical records submitted to the EQRO as part of the medical record review validation process (should be equal to Column B, or less than Column B if one or more records were not submitted on time).
Column D: Total number of medical records reviewed by the EQRO and identified as meeting numerator requirements.
Column E: Accuracy rate - percent of records selected for audit that were identified as meeting numerator requirements (Column D / Column B).
Column F: Error rate - percent of records selected for audit that were identified as not meeting numerator requirements (100% - Column E).
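
The Column E and Column F arithmetic can be written out directly; the sketch below is illustrative, and the example counts are hypothetical.

```python
def mrr_reabstraction_rates(selected_for_audit, found_compliant):
    """Table 1 arithmetic: accuracy rate (Column E) = Column D / Column B as a
    percent; error rate (Column F) = 100% minus the accuracy rate."""
    accuracy = 100.0 * found_compliant / selected_for_audit
    return accuracy, 100.0 - accuracy

# e.g., 30 records selected for audit (Column B), 26 found compliant (Column D)
accuracy_pct, error_pct = mrr_reabstraction_rates(30, 26)   # about 86.7% and 13.3%
```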

Step 2: Determining the Potential Impact of MRR Reabstraction Findings on Final Audit Designations
The next step in MRR validation is to determine whether any medical record review errors significantly
biased the final reported rate for a given performance measure. To make this determination, the EQRO,
as directed by the State, should develop and follow decision rules such as the following:
Sample Decision Rules:
Error Rate of 10 Percent or Less: If the error rate (Table 1, column F) is 10 percent or less, then the
measure automatically passes the MRR validation. The Final Audit Designation is then determined
based on the auditors’ findings from the ISCA conducted as Pre-Onsite activity 3 and Onsite Activity 1.
As long as no errors leading to significant bias are discovered during the other components of the audit
process, the final rate is considered as having met the validation standards.
Error Rate of Greater than 10 Percent: If the error rate (Table 1, column F) is greater than 10 percent,
then the auditors determine the impact of the MRR validation findings on the final reported rate for the
measure. For each of the measures under review, auditors evaluate the impact of the MCO’s MRR
processes on its final reported rate by extrapolating the findings from the audited medical record sample
to the universe of all MRR positives. Details on this process are provided in Table 2.

The maximum amount of bias allowed for the final rate to be considered reportable is "x" percentage points (to be determined by each State).
• If the amount of error in the MCO's MRR process (Table 2, line 8) does not cause the final reported rate to be biased by more than x percentage points, then the measure passes the MRR validation. The compliance designation is then determined based solely on the auditors' findings from the ISCA. As long as no errors leading to significant bias are discovered during the other components of the performance measure audit process, the final rate is considered valid.
• If the amount of error in the MCO's medical review process (Table 2, line 8) ultimately causes the final reported rate to be biased by more than x percentage points, the rate is automatically considered invalid. The performance measure is then designated as invalid.

TABLE 2: Impact of MRR Findings

The table is completed separately for each measure under review (Measure A, Measure B, Measure C, and so on).

Line 1: Final Data Collection Method Used (e.g., MRR, hybrid, etc.)
Line 2: Error Rate (percentage of records selected for audit that were identified as not meeting numerator requirements, as shown in Table 1, Column F)
Line 3: Is the error rate 10% or less? (Yes or No)
   -- If yes, the MCO passes MRR validation; no further MRR calculations are necessary.
   -- If no, the rest of the spreadsheet is completed to determine the impact on the final rate.
Line 4: Denominator (the total number of members identified for the denominator of this measure, as identified by the MCO)
Line 5: Weight of Each Medical Record (impact of each medical record on the final overall rate; determined by dividing 100% by the denominator in Line 4)
Line 6: Total Number of MRR Numerator Positives identified by the MCO using MRR
Line 7: Expected Number of False Positives (estimated number of medical records inappropriately counted as numerator positives; determined by multiplying the Error Rate in Line 2 by Line 6, the total number of MRR numerator positives reported)
Line 8: Estimated Bias in Final Rate (the amount of bias caused by medical record review, measured in percentage points; determined by multiplying the Expected Number of False Positives in Line 7 by Line 5, the Weight of Each Medical Record)

If Line 8 is greater than x percentage points, then the final rate is considered to be significantly biased and the measure will be considered invalid.
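
The Line 5, 7, and 8 arithmetic, and the comparison against the State-defined threshold "x", can be written out as follows; the example values (including the threshold of 5 percentage points) are hypothetical.

```python
def estimated_mrr_bias(error_rate_pct, denominator, mrr_numerator_positives):
    """Table 2 arithmetic:
    Line 5: weight of each record       = 100% / denominator
    Line 7: expected false positives    = (error rate) x (MRR numerator positives)
    Line 8: estimated bias (pct. pts.)  = expected false positives x weight"""
    weight = 100.0 / denominator
    expected_false_positives = (error_rate_pct / 100.0) * mrr_numerator_positives
    return expected_false_positives * weight

# e.g., a 13.3% error rate, a denominator of 411, and 120 MRR numerator positives
bias_points = estimated_mrr_bias(13.3, 411, 120)      # roughly 3.9 percentage points
passes = bias_points <= 5.0                           # "x" is set by the State; 5 is an assumption
```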


Worksheet 13: Policies, Procedures, Data, and Information Used to Implement Sampling:
Review Checklist

Documents (for each, the reviewer records Reviewed, Not Reviewed, and Comments):

• Description of software used to execute the sampling sort of population files when sampling (e.g., systematic) is used
• Policies to maintain files from which the samples are drawn in order to keep the population intact in the event that a sample must be re-drawn or replacements made
• Computer source code or logic identifying specified sampling techniques, and documentation that the logic matches the specifications set forth for each performance measure, including sample size and exclusion methodology
• Methods used for sampling for measures calling for hybrid data or medical record review
• Documentation assuring that sampling methodology treats all measures independently, and that there is no correlation between drawn samples
• Observation of or documentation of procedures in which a biased sample was identified and corrected
• Documentation of "frozen" or archived files from which the samples were drawn, and if applicable, documentation of the MCO's process to re-draw a sample or obtain necessary replacements
Describe Documentation Review and Demonstrations Provided:


Worksheet 14: Sampling Validation Findings Worksheet

For each audit element, the reviewer records Met, Not Met, or N/A, plus comments.

The MCO has followed the specified sampling method to produce an unbiased sample which is representative of the entire at-risk population.
• Each relevant member or provider had an equal chance of being selected; no one was systematically excluded from the sampling.
• The MCO/PIHP followed the specifications set forth in the performance measure regarding the treatment of sample exclusions and replacements, and if any activity took place involving replacements of or exclusions from the sample, the MCO kept adequate documentation of that activity.
• Each provider serving a given number of enrollees had the same probability of being selected as any other provider serving the same number of enrollees.
• The MCO examined its sampled files for bias, and if any bias was detected, the MCO is able to provide documentation that describes any efforts taken to correct it.
• The sampling methodology employed treated all measures independently, and there is no correlation between drawn samples.
• Relevant members or providers who were not included in the sample for the baseline measurement had the same chance of being selected for the follow-up measurement as providers who were included in the baseline.

The MCO maintains its performance measurement population files/data sets in a manner which allows a sample to be re-drawn, or used as a source for replacement.
• The MCO has policies and procedures to maintain files from which the samples are drawn in order to keep the population intact in the event that a sample must be re-drawn, or replacements made, and documentation that the original population is intact.

Sample sizes collected conform to the methodology set forth in the performance measure specifications, and the sample is representative of the entire population.
• Sample sizes meet the requirements of the performance measure specifications.
• The MCO has appropriately handled the documentation and reporting of the measure if the requested sample size exceeds the population size.
• The MCO properly oversampled in order to accommodate potential exclusions.

For performance measures which include medical record reviews (e.g., hybrid data collection methodology), proper substitution methodology was followed.
• Substitution applied only to those members who met the exclusion criteria specified in the performance measure definitions or requirements.
• Substitutions were made for properly excluded records, and the percentage of substituted records was documented.

END OF ATTACHMENT


