External Quality Review of Medicaid MCOs and Supporting Regulations in 42 CFR 438.360, 438.362, and 438.364 (CMS-R-305)

OMB: 0938-0786

P3 Attachment A - PIP worksheet 5-22-12

EQR PROTOCOL 3 – Validation of Performance Improvement Projects (PIPs)
Attachment A: PIP Review Worksheet
December 2011
PERFORMANCE IMPROVEMENT PROJECT VALIDATION WORKSHEET
Use this or a similar worksheet as a guide when validating MCO Performance Improvement
Projects. Answer all questions for each activity. Refer to the protocol for detailed information on
each area.

ID of evaluator:

Date of evaluation: ____ / ____ / ____

Demographic Information

MCO Name or ID:
Project Leader Name:
Telephone Number:
Name of Performance Improvement Project:
Dates in Study Period: ____ / ____ / ____ to ____ / ____ / ____

Type of Delivery System (check all that apply):
☐ Staff Model
☐ Network
☐ Direct IPA
☐ IPA Organization
☐ MCI
☐ PIHP
☐ PCCM
☐ Other

Number of Medicaid/CHIP Enrollees in MCO:
Number of Medicaid/CHIP Enrollees in Study:
Total Number of MCO Enrollees in Study:

Number of MCO primary care physicians:
Number of MCO specialty physicians:
Number of physicians in study (if applicable):


ACTIVITY 1: ASSESS THE STUDY METHODOLOGY

Step 1: Review the Selected Study Topic(s)

Component/Standard | Y / N / N/A | Comments

1.1. Was the topic selected through data
collection and analysis of
comprehensive aspects of specific MCO
enrollee needs, care, and services?
1.2. Is the PIP consistent with the
demographics and epidemiology of the
enrollees?
1.3. Did the PIP consider input from enrollees
with special health needs, especially those
with mental health and substance abuse
problems?
1.4. Did the PIP, over time, address a broad
spectrum of key aspects of enrollee care
and services (e.g., preventive, chronic,
acute, coordination of care, inpatient,
etc.)?
1.5. Did the PIP, over time, include all enrolled
populations, including enrollees with special
health care needs?

Step 2: Review the Study Question(s)

Component/Standard | Y / N / N/A | Comments

2.1. Was/were the study question(s)
measurable and stated clearly in writing?

Step 3: Review Selected Study Indicator(s)

Component/Standard | Y / N / N/A | Comments

3.1. Did the study use objective, clearly defined,
measurable indicators (e.g., an event or
status that will be measured)?
3.2. Did the indicators track performance over a
specified period of time?
3.3. Is the number of indicators adequate to answer
the study question, appropriate for the level of
complexity of the applicable medical practice
guidelines, and appropriate to the availability of
the necessary data and the resources to collect
them?

Step 4: Review the Identified Study Populations

Component/Standard | Y / N / N/A | Comments

4.1. Were the enrollees to whom the study
question and indicators are relevant clearly
defined?
4.2. If the entire population was studied, did the
data collection approach capture all enrollees to
whom the study question applied?

Step 5: Review Sampling Methods

Component/Standard | Y / N / N/A | Comments

5.1. Did the sampling technique consider and
specify the true (or estimated) frequency of
occurrence of the event, the confidence interval
to be used, and the acceptable margin of error?
(An illustrative sample-size sketch follows this step.)
5.2. Were valid sampling techniques employed
that protected against bias? Specify the
type of sampling or census used:
5.3. Did the sample contain a sufficient number
of enrollees?
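
The sample-size considerations in item 5.1 can be made concrete with a worked example. The following is a minimal, illustrative sketch only, not part of the protocol or the worksheet: it assumes a simple random sample for a binary (rate-based) indicator, uses the normal approximation for a proportion, and applies an optional finite population correction. The function name, parameter values, and example figures are hypothetical.

```python
import math

def required_sample_size(expected_rate, margin_of_error, confidence=0.95,
                         population_size=None):
    """Estimate the simple-random-sample size needed to measure a proportion.

    Uses the normal approximation n = z^2 * p * (1 - p) / e^2 and, when a
    finite population size N is supplied, the correction n / (1 + (n - 1) / N).
    Illustrative only; the MCO's documented sampling methodology governs.
    """
    # Two-sided z value for the requested confidence level (e.g., 1.96 for 95%).
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    p, e = expected_rate, margin_of_error
    n = (z ** 2) * p * (1 - p) / (e ** 2)
    if population_size is not None:
        n = n / (1 + (n - 1) / population_size)  # finite population correction
    return math.ceil(n)

# Hypothetical example: expected event rate 40%, +/-5 percentage point margin
# of error, 95% confidence, 6,000 eligible enrollees in the study population.
print(required_sample_size(0.40, 0.05, confidence=0.95, population_size=6000))
# -> 348 (369 without the finite population correction)
```

A reviewer marking item 5.1 "Y" would expect the PIP documentation to state the assumed rate, confidence level, and margin of error behind the chosen sample size, whatever calculation method was actually used.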

Step 6: Review Data Collection Procedures

Component/Standard | Y / N / N/A | Comments

6.1. Did the study design clearly specify the
data to be collected?
6.2. Did the study design clearly specify the
sources of data?
6.3. Did the study design specify a systematic
method of collecting valid and reliable data
that represents the entire population to
which the study’s indicators apply?
6.4. Did the instruments for data collection
provide for consistent and accurate data
collection over the time periods studied?
6.5. Did the study design prospectively specify
a data analysis plan?
6.6. Were qualified staff and personnel used to
collect the data?


Step 7: Assess Improvement Strategies

Component/Standard | Y / N / N/A | Comments

7.1. Were reasonable interventions undertaken to
address the causes/barriers identified through
data analysis and QI processes?
7.2. Are the interventions sufficient to be expected
to improve processes or outcomes?
7.3. Are the interventions culturally and
linguistically appropriate?

Step 8: Review Data Analysis and Interpretation of Study Results

Component/Standard | Y / N / N/A | Comments

8.1. Was an analysis of the findings performed
according to the data analysis plan?
8.2. Were numerical PIP results and findings
accurately and clearly presented?
8.3. Did the analysis identify: initial and repeat
measurements, statistical significance,
factors that influence comparability of initial
and repeat measurements, and factors
that threaten internal and external validity?
8.4. Did the analysis of study data include an
interpretation of the extent to which the PIP was
successful, and any recommended follow-up activities?

Step 9: Assess Whether Improvement is “Real” Improvement

Component/Standard | Y / N / N/A | Comments

9.1. Was the same methodology as the
baseline measurement used when
measurement was repeated?
9.2. Was there any documented, quantitative
improvement in processes or outcomes of
care?
9.3. Does the reported improvement in
performance have “face” validity (i.e., does
the improvement in performance appear to
be the result of the planned quality
improvement intervention)?


9.4. Is there any statistical evidence that the
observed performance improvement is true
improvement? (See the illustrative significance-test
sketch below.)
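
Items 8.3 and 9.4 ask the reviewer to look for statistical evidence that the change between the baseline and repeat measurements is more than chance variation. The sketch below is a minimal, hypothetical illustration of one common approach for rate-based indicators, a two-sided two-proportion z-test under the normal approximation; it is not part of the protocol, and the MCO's own data analysis plan determines which test is actually appropriate. All counts in the example are invented.

```python
import math

def two_proportion_z_test(baseline_hits, baseline_n, repeat_hits, repeat_n):
    """Two-sided two-proportion z-test (normal approximation).

    Returns (z, p_value) for the difference between the baseline rate and the
    repeat-measurement rate. Illustrative sketch only.
    """
    p1 = baseline_hits / baseline_n
    p2 = repeat_hits / repeat_n
    # Pooled rate under the null hypothesis that nothing changed.
    pooled = (baseline_hits + repeat_hits) / (baseline_n + repeat_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / baseline_n + 1 / repeat_n))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: baseline 220/400 (55%) vs. remeasurement 260/400 (65%).
z, p = two_proportion_z_test(220, 400, 260, 400)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p (e.g., < 0.05) weighs toward "Y" on 9.4
```

A significant result supports, but does not by itself establish, "real" improvement; items 9.1 through 9.3 (same methodology, documented change, face validity) still apply.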

Step 10: Assess Sustained Improvement

Component/Standard | Y / N / N/A | Comments

10.1. Was sustained improvement
demonstrated through repeated
measurements over comparable time
periods?

ACTIVITY 2: VERIFYING STUDY FINDINGS (OPTIONAL)
1. Were the initial study findings verified upon
repeat measurement?

ACTIVITY 3: EVALUATE OVERALL VALIDITY AND RELIABILITY OF STUDY RESULTS:
SUMMARY OF AGGREGATE VALIDATION FINDINGS
Check one:
☐ High confidence in reported PIP results
☐ Confidence in reported PIP results
☐ Low confidence in reported PIP results
☐ Reported PIP results not credible
END OF DOCUMENT
