Program Evaluation of the Ninth Scope of Work Quality Improvement Organization Program (CMS-10294)

OMB: 0938-1104

C. SUPPORTING STATEMENT PART B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Respondent Universe and Sampling Methods

a. Respondent Universe

QIO Survey
The target population for the survey of QIOs includes all state QIO contracts. In total, there
are 53 state QIO contracts: CMS contracts with one organization for each of the 50 states and the
District of Columbia, and one contract each for Puerto Rico and the U.S. Virgin Islands. We will
survey all QIO directors and the theme/sub-theme leaders within QIOs for the following themes
and sub-themes: Patient Safety – Pressure Ulcers, Patient Safety – Physical Restraints, Patient
Safety – Surgical Care Improvement Project, Patient Safety – Methicillin-Resistant
Staphylococcus Aureus, Patient Safety – Drug Safety, Patient Safety – Nursing Homes in Need,
Prevention, Prevention – Disparities, Prevention – Chronic Kidney Disease, and Care
Transitions. Three of these themes are sub-national: Prevention – Disparities (6 QIOs),
Prevention – CKD (10 QIOs), and Care Transitions (14 QIOs).
Hospital and Nursing Home Surveys
For the surveys of hospitals and nursing homes, respectively, the target population consists
of providers (hospitals or nursing homes) certified to provide Medicare-Medicaid services. Those
providers are identified by being listed in the CMS Online Survey and Certification Reporting
System (OSCAR) database. The approximate population sizes are 4,500 hospitals and 16,000
nursing homes.[1] For each survey, the sampling frame will be constructed from the most recent
CMS Provider of Services File, which is an extract file created from the OSCAR. Because some
variables for hospitals and for nursing homes differ, we will prepare two extracts of data: one for
the hospital population and the other for the nursing home population. The extracts will include
the variables needed for sample selection and the computation of the weights. The extracts will
exclude providers no longer in service, if any.

[1] This number of hospitals includes roughly 950 Critical Access Hospitals (CAHs). If QIOs are working with CAHs in the 9th SOW, CAHs will be included in the sample universe. If QIOs are not working with CAHs, they will be excluded from the universe, and the sample universe of hospitals will be roughly 3,550.
Since the QIO survey does not require sampling, in the subsequent sections we will describe
sampling methodology for the hospital and nursing home surveys only.
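As an illustration of the frame-construction step described above, the sketch below builds the two sampling-frame extracts from a Provider of Services-style file. The file name, column names, and category codes are hypothetical placeholders, not the actual CMS file layout.

import pandas as pd

# Hypothetical file and column names; the real Provider of Services extract
# uses its own documented layout and codes.
pos = pd.read_csv("provider_of_services.csv", dtype=str)

# Drop providers no longer in service, if any.
active = pos[pos["termination_code"] == "00"]

# Separate extracts, since hospitals and nursing homes carry different variables.
hospital_frame = active[active["provider_category"] == "hospital"][
    ["provider_id", "state", "cms_region", "urban_rural", "control_type", "bed_count"]
]
nursing_home_frame = active[active["provider_category"] == "nursing_home"][
    ["provider_id", "state", "cms_region", "control_type", "snf_bed_count"]
]

hospital_frame.to_csv("hospital_frame.csv", index=False)
nursing_home_frame.to_csv("nursing_home_frame.csv", index=False)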
Qualitative Discussions with QIO Partner Organizations
The target population for the qualitative discussions with QIO partner organizations is
organizations that work with QIOs on the Care Transitions theme (14 states) and Prevention –
Chronic Kidney Disease (CKD) theme (10 states). The number of QIO partner organizations per
state cannot be determined from existing data. For the Prevention – CKD theme, CMS staff
indicated that some individual states are working with over 40 organizations, including statewide
chapters of national organizations (such as the National Kidney Foundation), local physician
organizations, and voluntary hospital organizations. For the Care Transitions theme, we know
that across the 14 participating states the theme involves 70 hospitals, 277 skilled nursing
facilities, 316 home health agencies, and other providers, as well as other health care
organization collaborators that are not providers.
Case Studies
The target population for the case studies is 49 of the 53 state QIO contracts. As noted
above, the 53 state contracts include the 50 states and the District of Columbia, Puerto Rico, and
the Virgin Islands. We will exclude Puerto Rico, the Virgin Islands, and Alaska and Hawaii from
the target population of states due to resource constraints pertaining to travel costs. (In addition,
conducting interviews in Spanish in Puerto Rico would require a change in project staffing, and
the QIO contract for the Virgin Islands has unique aspects to it relative to the other QIO
contracts.)
b. Sampling Methods
Hospital and Nursing Home Surveys
The sample design for both the hospital and nursing home surveys has two objectives: (1) to
support survey estimates from national samples of hospitals and nursing homes, and (2) to
support a regression discontinuity (RD) research design for the Ninth SOW evaluation. Both the
hospital and nursing home samples will be stratified to facilitate the regression discontinuity
analysis. We will stratify each sampling frame primarily on the scores used in the determination
of “facilities targeted for improvement” in QIOs’ Ninth SOW contracts.
For the hospital survey, hospitals will be stratified by their Surgical Care Improvement
Project (SCIP) Appropriate Care Measure (ACM) score, which is the score variable used by CMS
to create its list of targeted providers (the "J-17" list[2]) and which will be used by Mathematica as
the forcing variable in the RD design. We will first form explicit strata based on the CMS-established
targeting cut-off score.[3] We will group hospitals into four strata:
• Above the cut-off that defines whether hospitals were or were not on the CMS target
list for improvement (hospitals below the cut-off are targeted for improvement), and
far from the cut-off
• Above the cut-off, and near the cut-off
• Below the cut-off, and near the cut-off
• Below the cut-off, and far from the cut-off

[2] The list was published as attachment J-17 to the CMS request for proposals for QIOs.

[3] CMS included hospitals on the target list if their ACM score was 30 points or more below the Achievable Benchmarks of Care (ABC) threshold in both of the two most recent quarters (2006 Q4 and 2007 Q1). Effectively, this means that the inclusion cut-off is based on whether a hospital's best ACM score in those two quarters was above or below that 30-point threshold. Consequently, we use hospitals' best score in those two quarters as the score variable in the RD analyses and as the basis for stratifying providers for sample selection.
The hospitals in the two strata above and below the cut-off that are near the cut-off are the
set that will be used in the RD analysis (“within the RD bandwidth”). In order to support the RD
analyses, we will oversample providers near the cut-off (both near-above and near-below, that is,
those that are within the RD bandwidth). Sampled providers outside of this range will be used
only to calculate nationally representative descriptive statistics. Within these explicit strata, we
will use implicit stratification to improve the distributional characteristics of the samples based
on the following variables:
- CMS regions (10 levels):
  01 = Boston
  02 = New York
  03 = Philadelphia
  04 = Atlanta
  05 = Chicago
  06 = Dallas
  07 = Kansas City
  08 = Denver
  09 = San Francisco
  10 = Seattle

- Urban/rural (2 levels)

- Type of control (3 levels):
  01 = Non-profit
  02 = Proprietary (for-profit)
  03 = Government

- Type/size of facility (4 levels):
  01 = Critical Access Hospitals (CAHs)
  02 = Smallest tercile of non-CAH hospitals in number of beds
  03 = Middle tercile of non-CAH hospitals in number of beds
  04 = Largest tercile of non-CAH hospitals in number of beds

In each explicit stratum, we will select an equal probability sample of hospitals. For implicit
stratification we will use Chromy’s (1979) probability-minimum-replacement procedure. This
method permits deep implicit stratification through sorting and avoids the potential bias that
may be associated with systematic samples. The Chromy procedure also permits unbiased estimation
of the sampling variance.
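To make the stratum assignment concrete, the sketch below places each hospital in one of the four explicit strata using its J-17 targeting status and its distance from the cut-off, sorts within strata by the implicit stratification variables, and then takes an equal-probability systematic draw. It is a simplified illustration under hypothetical column names and an assumed bandwidth value; the actual selection will use Chromy's probability-minimum-replacement procedure rather than this plain systematic draw.

import pandas as pd

CUTOFF = 80.0      # hypothetical ACM cut-off; the real value comes from the J-17 targeting rules
BANDWIDTH = 10.0   # hypothetical half-width of the "near the cut-off" (RD) range

def assign_stratum(acm_score: float, on_j17_list: bool) -> str:
    """Explicit stratum: J-17 targeting status crossed with near/far from the cut-off."""
    near = abs(acm_score - CUTOFF) <= BANDWIDTH
    return ("J17" if on_j17_list else "No") + ("/Near" if near else "/Far")

def select_sample(frame: pd.DataFrame, n_per_stratum: dict) -> pd.DataFrame:
    """Equal-probability selection within each explicit stratum, with implicit
    stratification approximated by sorting before a systematic draw."""
    frame = frame.copy()
    frame["stratum"] = [
        assign_stratum(s, t) for s, t in zip(frame["acm_score"], frame["on_j17_list"])
    ]
    picks = []
    for stratum, n in n_per_stratum.items():
        members = frame[frame["stratum"] == stratum].sort_values(
            ["cms_region", "urban_rural", "control_type", "size_class"]
        )
        step = len(members) / n                      # constant sampling interval
        picks.append(members.iloc[[int(i * step) for i in range(n)]])
    return pd.concat(picks)

For example, calling select_sample with n_per_stratum = {"J17/Far": 131, "J17/Near": 606, "No/Near": 823, "No/Far": 225} would reproduce the allocation shown in Table C.1 below.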
Our initial sample of hospitals will be inflated to account for nonresponse; the target number
of hospitals in the responding sample is 1,250. Of these 1,250 respondents, we plan for 1,000 to
come from the two strata close to the cut-off, to maximize the sample available for use in the RD
analyses, while still sampling from all strata in order to allow for the calculation of nationwide
descriptive statistics. We will also sample more heavily from the J17/Far stratum than from the
No/Far stratum in order to elevate the number of providers that work with QIOs (PPs) in the
sample. Table C.1 displays the total number of providers within each stratum, the number of
providers proposed to be sampled, the sampling rates, and the expected number of completed
surveys. Because (a) there are only 606 providers in the J17/Near stratum (the stratum of targeted
providers within the RD bandwidth) and (b) we expect a 70 percent response rate, we can expect
no more than 424 respondents from that stratum. Thus, within the RD bandwidth we propose
completing interviews for 424 providers below the cutoff and 576 above it. The remainder of the
respondent sample (250) will be used to complete the national sample used to produce
descriptive statistics.
TABLE C.1

COUNTS OF HOSPITALS IN EACH STRATUM AND ALLOCATION OF SURVEY SAMPLE ACROSS STRATA

Strata      All Hospitals   Proposed Number of   Proposed Percent of   Expected Number
                            Hospitals Sampled    Hospitals Sampled     of Completes*
J17/Far           318              131                 41.2%                  92
J17/Near          606              606                100.0%                 424
No/Near         1,352              823                 60.9%                 576
No/Far          1,388              225                 16.2%                 158
Total           3,644            1,786                 48.7%               1,250

Note: "J-17" denotes facilities targeted by CMS for improvement, "No" denotes not on the J-17 target list, "Near" denotes within RD bandwidth range (near the cutoff), and "Far" denotes outside of that range.

*Response rates are assumed to be 70 percent.
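The expected completes in Table C.1 follow from the assumed 70 percent response rate, and the proposed sample sizes are simply the desired completes inflated for nonresponse and capped at the stratum count. A minimal sketch of that arithmetic (the helper function is illustrative only):

def sample_size_needed(target_completes: int, stratum_size: int,
                       response_rate: float = 0.70) -> int:
    """Inflate the target number of completes for expected nonresponse,
    never exceeding the number of providers available in the stratum."""
    return min(round(target_completes / response_rate), stratum_size)

# The J17/Near stratum holds only 606 hospitals, so even a 100% sampling rate
# yields about 606 * 0.70 = 424 expected completes.
print(sample_size_needed(500, 606))   # -> 606 (capped at the stratum size)
print(round(606 * 0.70))              # -> 424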

The stratified sampling approach does introduce a need to use sampling weights in
calculating the descriptive statistics. The weights reduce the effective sample size and, in turn,
the precision of the estimates. For the subsample of hospitals that receive assistance from QIOs,
which is projected to be 25 percent of our hospital sample, the two-tailed, 95 percent confidence
interval for a binary outcome with a mean of 0.5 is ±0.065. The confidence interval will be
narrower for outcomes with means nearer to either 0 or 1, as well as for measures calculated for
the entire sample rather than for PPs only.
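As a rough check on the ±0.065 figure, the half-width of a two-tailed 95 percent confidence interval for a proportion is 1.96·sqrt(p(1−p)/n_eff), where n_eff is the number of respondents deflated by the design effect of the weights. The design effect used below (about 1.37) is an assumption chosen for illustration, not a value taken from the sampling plan.

import math

def ci_half_width(p: float, n_respondents: float, design_effect: float) -> float:
    """Half-width of a two-tailed 95% confidence interval for a weighted proportion."""
    n_eff = n_respondents / design_effect          # effective sample size
    return 1.96 * math.sqrt(p * (1 - p) / n_eff)

n_pp = 0.25 * 1250                                 # PP subsample: roughly 25% of 1,250 completes
print(round(ci_half_width(0.5, n_pp, design_effect=1.37), 3))   # -> 0.065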
For the survey of nursing homes, we propose a sampling design based on the same
principles to support RD analyses of outcomes for the pressure ulcer PPs. In January 2008, CMS
established the cut-off for inclusion of nursing homes on the J-17 targeting list: 20 percent or
more high-risk long-stay residents with pressure ulcers. As with hospitals, we will use four
strata, defined by a combination of the J-17 targeting cut-off and the RD bandwidth. For the RD
sample, with an equal number of respondents on each side of the selection cutoff, we need 900
responding nursing homes within the RD bandwidth to provide an adequate sample for analysis.
To obtain that sample, we will sample higher proportions of providers within the RD bandwidth.
Table C.2 shows the proposed number of nursing homes sampled, sampling rates, and expected
number of completed surveys. Providers in the "No/Near" and "J17/Near" rows will be in the RD
impact analysis sample.
TABLE C.2

COUNTS OF NURSING HOMES IN EACH STRATUM AND ALLOCATION OF SURVEY SAMPLE ACROSS STRATA

Strata      All Nursing      Proposed Number   Proposed Percent   Expected Number
            Homes (NHs)      of NHs Sampled    of NHs Sampled     of Completes*
J17/Far           585              150              25.6%               105
J17/Near        1,357              643              47.4%               450
No/Near         3,223              643              20.0%               450
No/Far         10,793              350               3.2%               245
Total          15,958            1,786              11.2%             1,250

Note: "J-17" denotes facilities targeted by CMS for improvement in reducing prevalence of pressure ulcers, "No" denotes not on the J-17 target list, "Near" denotes within RD bandwidth range (near the cutoff), and "Far" denotes outside of that range.

*Response rates are assumed to be 70 percent.

For sample selection, we will use implicit stratification using the following variables:
- CMS regions (10 levels):
  01 = Boston
  02 = New York
  03 = Philadelphia
  04 = Atlanta
  05 = Chicago
  06 = Dallas
  07 = Kansas City
  08 = Denver
  09 = San Francisco
  10 = Seattle

- Type of control (3 levels):
  01 = Non-profit
  02 = For-profit
  03 = Government

- Number of Medicaid certified skilled nursing care beds in the facility (3 groups based on terciles of the distribution)

For the pressure ulcer nursing home PPs, anticipated to be 20.4 percent of our nursing home
sample, the two-tailed, 95 percent confidence interval around a binary outcome with a mean of 0.5 is ±0.085. Again,
confidence intervals will be narrower for binary responses with means that are closer to 0 or 1.
Changes to the Sampling Plan Originally Proposed for the Hospital and Nursing Home
Surveys
This sampling plan differs in two respects from the sampling plan originally submitted to
OMB. The first difference is a reduction, from 82.5% to 70.0%, in the sampling rate for hospitals
in the "J17/Near" stratum after accounting for sample nonresponse. That alteration was made in
order to make the rate consistent with the assumption that response rates will be 70%. Given that
assumption, the maximum sampling rate for any stratum after accounting for nonresponse is 70%.
That reduction was accompanied by an increase in the sampling rate for the "J17/Far" stratum to
maintain a total of 1,250 completed surveys and a total RD analysis sample size of 1,000.
The second change in the sampling plan is to focus on one of the two nursing home
outcomes in order to ensure sufficient power for both the descriptive statistics and the RD
analyses undertaken.
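For context on how the completes within the RD bandwidth will be used, the sketch below shows a basic sharp regression discontinuity estimate: restrict to providers within the bandwidth, center the forcing score at the cut-off, and regress the outcome on the targeting indicator with separate slopes on each side. This is a generic RD formulation under assumed column names, not the evaluation's actual model specification.

import pandas as pd
import statsmodels.formula.api as smf

def rd_estimate(df: pd.DataFrame, cutoff: float, bandwidth: float):
    """Sharp-RD estimate of the effect of J-17 targeting within the bandwidth.

    Assumed columns: 'score' (forcing variable), 'targeted' (0/1 J-17 status),
    and 'outcome' (a survey or quality measure)."""
    df = df.assign(score_c=df["score"] - cutoff)              # center at the cut-off
    near = df[df["score_c"].abs() <= bandwidth]               # RD analysis sample
    model = smf.ols("outcome ~ targeted + score_c + targeted:score_c", data=near).fit()
    return model.params["targeted"], model.bse["targeted"]    # discontinuity and its SE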


Qualitative Discussions with QIO Partner Organizations
QIO partners who work in the areas of Patient Pathways (Care Transitions or CT) and
Prevention–Chronic Kidney Disease (CKD) will be targeted for discussions from November
2010 – February 2011. Up to 200 CKD and CT theme partners will be screened and/or
interviewed so researchers can understand their perceptions of the value of QIO support in
achieving the theme objectives. Eight states will be selected randomly from among regional lists
of the states participating in each of these themes,[4] to assure regional variation as much as
possible given the specific states involved. For the Prevention – CKD theme, the QIOs serving
these states will be asked to identify all theme partners and their role in achieving theme
objectives and to specify a lead contact for each. Next, we will screen every partner listed by the
QIO for their level of engagement with the QIO and whether the QIO had any influence on their
activities. For those who indicate a significant level of engagement or QIO influence, we will
complete the full discussion protocol.
For the Care Transitions theme, we will also ask the QIO to identify all partnered
organizations. Based on the national numbers of participating provider organizations for this
theme, we anticipate needing to select up to 14 partner organizations from a longer list, to
achieve a mix of respondents who represent diverse partnered health care organizations that
together are involved with a relatively high proportion of transitions in care that are the target of
the interventions for this theme.
These qualitative discussions will be informative but are not meant to provide representative
estimates for the population of QIO partner organizations.

[4] The same state may be selected for both themes, so there may be fewer than 16 states in total.

Changes to the Plan Originally Proposed for the Qualitative Discussions with QIO Partner
Organizations
Due to unavoidable project schedule delays, it is no longer possible to conduct two rounds
of interviews one year apart as originally planned within the project period of performance.
Scheduling the second round of interviews less than one year after the first round would have run
the risk of not observing any changes in community partner perceptions. We discussed with
CMS our opinion that the benefit of conducting round 2 less than one year later would not be
worth the cost of doing so. To resolve this issue, we plan (with CMS concurrence) to conduct
one round of interviews, rather than two, during November 2010 – February 2011, and expand
the number of interviews per state as described above. This expansion will allow us to gain a
more complete understanding of the QIO’s role in influencing quality improvement in these
themes.
Also since the original draft, we have gained a better understanding of how these two
themes operate, and have obtained from CMS national figures on the number of providers in the
Care Transitions theme. This has allowed us to refine our plan for selecting respondents to take
into account the different types and number of organizations that the QIO is partnering with on
these two themes.
Case Studies
MPR will conduct twelve case studies of QIOs and related stakeholder organizations during
November 2010 through May 2011.
Selection of Case Study Sites. Although we want to pick 12 states that provide a good
representation of certain characteristics, the goal is not to draw a scientific sample from which to
estimate population parameters. The criteria for the 12 case studies are that they:


1. Include at least two states that are participating in the Prevention-Disparities theme,
one of which is New York, since that state has the only significant Hispanic
population receiving DSME training, and we want to ensure their experience is
represented in the evaluation
2. Include at least two states that are participating in the Prevention-CKD theme (which
may or may not overlap with number 1 above)
3. Include at least three states participating in the Care Transitions theme (which may or
may not overlap with criteria 1 and 2 above)
4. Represent equally the four U.S. regions of Northeast, Midwest, South, and West
5. Represent variation in the size of the state Medicare populations
6. Include states that vary in the extent of their rurality (as measured by population
density)
7. Include states that vary in their budget per provider the QIO worked with
We will first select New York, to ensure representation of a Hispanic population in the
disparities theme. We will then divide the remaining 48 continental U.S. states (47 states plus the
District of Columbia) into 64 cells as defined by the four regions, state Medicare populations
(above or below the median Medicare population across states), participation status in any of the
three subnational themes (21 states participate in at least one theme and 28 participate in none),
population density (above or below the median), and QIO budget per participating provider
(above or below the median). Some of the cells will not be populated; however, QIOs have
commented (after a recent presentation of the evaluation plan) that they would like to have all
these variables considered to some extent in the case study selection process. Therefore, we will
randomly select populated cells without replacement (meaning once a cell has been used, we will
not use it again), and then draw a state from within each selected cell (if there are multiple states
in the cell). After selecting six states in this fashion, we will assess the mix of states for the
desired characteristics (especially participation in the subnational themes). If it appears from our
initial six selections that we may not fulfill the above criteria, we will revise the selection process
(drawing the next three states from cells in which states are participating in CKD, for example)
in order to meet the criteria.
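A minimal sketch of the cell-based draw described above, assuming the 48 remaining jurisdictions have already been coded on the five classification variables; the field names and the use of Python's random module are illustrative only.

import random
from collections import defaultdict

def select_case_study_states(states, n_to_select, seed=None):
    """Randomly draw populated cells without replacement, then one state per cell.

    `states` is a list of dicts, each with a 'name' key plus five hypothetical
    classification fields: region, medicare_pop, theme_participation, density,
    and budget_per_provider (each coded as a region/participation category or
    an above/below-median flag)."""
    rng = random.Random(seed)
    cells = defaultdict(list)
    for s in states:
        key = (s["region"], s["medicare_pop"], s["theme_participation"],
               s["density"], s["budget_per_provider"])
        cells[key].append(s)                    # only populated cells appear here
    cell_keys = list(cells)
    rng.shuffle(cell_keys)                      # cells drawn without replacement
    return [rng.choice(cells[k])["name"] for k in cell_keys[:n_to_select]]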
Changes Proposed to the Original Plan for Selecting Case Study Sites. When we
presented our draft plan to an audience of QIO executives and staff at a conference this fall, they
provided feedback that they would like to see criteria similar to the criteria above used to select
case study sites. We agreed this would work well and adjusted our plan for case study site
selection accordingly.
Selection of Providers within Case Study Sites. We will ask QIOs to provide lists of the
providers they worked with on each theme and sub-theme, and the evaluation team will select
and secure participation from organizations on the lists.[5] The exception is that we will not
specifically seek to interview providers related to the Care Transitions theme because they will
be included in the evaluation's discussions with QIO partner organizations, described above. The
steps in the process
are as follows:
1. Create one list for each provider type (hospitals, nursing homes, physician practices)
of providers who worked with the QIO on any theme or sub-theme, along with their
city/state locations.
2. Examine city/state locations relative to a map to identify the locations of participating
providers that are feasible to visit on a single visit and include geographic diversity.
Typically this would include selecting two cities within a half-day drive of one
another, with rural area between them. One of these cities would be near the location
of the QIO. Providers that are feasible to visit would include those within a 40-minute
drive from either of the two cities plus those in the rural area between them.
3. For each type of provider, array in a table the list of providers in geographically
feasible locations (per Step 2), indicating the theme/sub-theme(s) each worked on
with the QIO.
4. Use the tables to select:

   - Three hospitals: together, the three will include hospitals working on all
     the patient safety sub-themes that involve hospitals.

   - Four nursing homes: one that worked with the QIO on Pressure Ulcers,
     one that worked with the QIO on Physical Restraints, one that worked on
     both Pressure Ulcers and Physical Restraints, and one that worked with the
     QIO on the Nursing Homes in Need sub-theme. Because there will only be
     two nursing homes in need to select from, we may opt to talk with one of
     these organizations by telephone if their locations are too geographically
     dispersed to visit.

   - Two physician practices that worked with the QIO on the Prevention
     and/or Prevention – CKD theme. In the two states that include a
     Prevention – Disparities theme, two physician practices that worked with
     the QIO on that theme.

[5] MPR will have legal access to these names because CMS and the QIOs are executing contract modifications to permit this to occur.

Selection of Community Health Leaders. To get a vantage point from outside the
immediate provider and QIO stakeholders, we will seek to interview one key person representing
the hospital community (such as a knowledgeable hospital association representative), one
representing the nursing home community, and one representing the physician community (for
example, a state chapter head of a primary care physicians’ professional association such as the
Academy of Family Physicians). We will seek to identify these key contacts through the QIO
during the scheduling process.
2. Procedures for the Collection of Information

QIO Survey
A self-administered web survey will be the primary data collection mode for the QIO
survey. CEOs or principal CMS contacts at the QIOs will be sent (1) an advance letter, printed
on CMS letterhead and signed by the CMS Project Officer, describing the survey and its two
components (the director component and the theme leader component) and requesting their help
to identify and provide contact information for the appropriate respondents, and (2) a project
description about the evaluation and the data collection components. (A copy of the letter is
included as Appendix C to this submission; Appendix D contains the project description.) We
will call any nonresponding CEOs/CMS principal contacts approximately five business days
after they should have received the letter, to encourage their response and answer any questions
about the study.
The QIO questionnaire (Appendix C) has been designed so that QIO directors and theme
leaders will be able to complete their respective parts of the survey in 20-30 and 45-60 minutes,
respectively. We expect that all respondents will complete the survey on the web, but we will
provide a paper-pencil version or allow them to complete it by phone for any who are unable to
complete it on the web. Approximately one week after the mailing of the advance letter, each
QIO director and all theme leaders (who were identified by the CEO or CMS principal contact)
will receive an email invitation to participate in the survey. The email invitation will contain a
hyperlink to the survey, with the individual’s username and password embedded. Following the
invitation email, a series of reminder emails will be sent to nonresponding QIO directors and
theme leaders.
MPR's goal is to complete 53 QIO director surveys and 342 theme leader surveys, for
response rates of 100 percent and 85 percent, respectively. The QIO and theme leader surveys
will be administered over a three-month period beginning in August 2010. We will establish a
toll-free help desk and general email address at Mathematica for assistance with the surveys. The
advance letter will be mailed to the CEOs/CMS principal contacts just prior to the start of data
collection (in early August). The request for contact information for respondents (included in
Appendix C) will follow by email within a day of expected receipt of the advance letter. The
invitation email to the target respondents named by the CEOs/CMS principal contacts (included
in Appendix C) will be sent approximately two weeks later to the QIO directors and theme
leaders that the CEOs/CMS principal contacts identified. We will then follow up with reminder
emails to nonresponders (included in Appendix C) on a biweekly basis for the duration of the
data collection period. Data collection will continue through week 17 after initiation.
Changes to Originally Planned Procedures for QIO Survey, Based on Pre-Testing
Our pre-testing assisted us in developing a systematic process for identifying the most
appropriate respondents for the survey, beginning with the contact information available to us
from CMS, followed by that individual’s identification of the other appropriate respondents (as
now described above). We had previously assumed that the “QIO Director” would always be the
CMS contact, and that the theme leader names and contact information would be available from
CMS. Regarding the instrument itself, the pre-test respondents found the questions generally
clear and relevant. Based on their feedback, we made the following types of modifications to the
survey instrument, the revised version of which is provided in Appendix C:
QIO Director Survey
• Pre-test feedback indicated that responses to most questions would vary by theme.
Therefore, we moved most questions to the theme leader survey, keeping in this
survey only bigger-picture questions addressing challenges to quality improvement
and recommendations to improve the QIO Program’s success.
• Questions on subcontracts were removed from the survey entirely following the pre-test.
They proved difficult for respondents to answer, and we also learned that most of what
we were looking for is available from another data source.
Theme Leader Survey
• Based on pre-test feedback, we modified this survey to include more questions here
that were previously on the QIO director survey—those regarding the 9th SOW
contract, and communications with CMS.
• We deleted question 2 from the QIO director survey, after pre-test feedback that it
was confusing.
• Wording of some questions was clarified to respond to comments by pre-testers.


• We added one new question to the “Impact of External Factors on This Theme”
section, to reflect comments among pre-testers that the role the state agency plays is
important for us to understand.
• We separated the theme leader surveys for each theme, so that it would be easier
(fewer skip patterns) for anyone who needed to complete them in paper form rather
than on the web. (The items remain heavily overlapped on the theme surveys.)
• We deleted several sections from two of the theme surveys (Patient Safety - Nursing
Homes in Need, and Patient Safety – Drug Safety) based on pre-testing and additional
information about how those themes operate within QIOs (such as from presentations
at CMS’ QualityNet conference).

Hospital and Nursing Home Surveys
A CATI survey will be the primary data collection instrument for the hospital and nursing
home surveys. Respondents will be sent a letter, printed on CMS letterhead and signed by the
CMS Project Officer, describing the survey and informing them that a Mathematica interviewer
will be calling shortly to schedule a convenient time to conduct the interview. A copy of the
letter is included as Appendix E to this submission. The hospital and nursing home
questionnaires (Appendix E) have been designed so that hospital QI directors and nursing home
administrators (the likely respective respondents) will be able to complete it in 30 and 20
minutes, respectively.
Mathematica’s goal is to complete surveys with 1,250 hospitals and 1,250 nursing homes for
a 70 percent response rate (from a total sample of 1,785 hospitals and 1,785 nursing homes).
Table C.3 presents the number of sampled entities, targeted completes, and response rates for the
QIO, theme leader, hospital and nursing home surveys. The surveys will be administered over a
four-month period beginning in May 2010. We will establish a toll-free line at MPR for
assistance with the surveys. The advance letter will be mailed to facilities just prior to the start of
data collection (in early May). MPR interviewers will call all facilities selected for inclusion in
the sample to set up appointments and conduct interviews. Reminder letters will be sent to all
nonresponding facilities at two points in time, roughly eight weeks into the data collection period
and again five weeks later. Data collection will continue through week 17.

TABLE C.3

EXPECTED SURVEY COMPLETES BY SURVEY TYPE

Survey          Number of Sample     Number of             Expected
                Points Released      Responding Entities   Response Rate
QIO                     53                  53                 100%
Theme Leader           402                 342                  85%
Hospital             1,785               1,250                  70%
Nursing Home         1,785               1,250                  70%

Case Study Discussions
QIO. The staff conducting the site visit will review the QIO's survey and selected
documents from the Standard Data Processing System (SDPS). The review of SDPS documents
will assist us to understand the interventions of the QIO in advance of the visit and to estimate to
what extent the QIO is meeting its performance goals to date. This will allow the time on site
with the QIO to be focused primarily on obtaining an understanding of (1) any problems
encountered or negative survey responses; (2) suggestions for improvements to the program; (3)
the QIO staff’s lessons learned and knowledge about what activities and techniques worked best
to improve quality during the Ninth SOW; (4) the state quality environment, and how that may
have affected the QIO’s strategies and success; (5) the recruitment of providers and/or
beneficiaries, which the evaluation needs to understand to properly interpret quantitative analysis
results; and (6) remaining barriers to further improvement.


We will plan for five hours of on-site interviews at the QIO to cover the core topics, to allow
for a discussion of each theme and sub-theme as well as a broad-based discussion with the QIO
director. An additional 30 minutes per theme will be needed for the QIO portion of the site visit
at states that have taken on additional themes (Prevention–Disparities, Prevention–CKD, and/or
Care Transitions). Officials who lead the themes at the national level suggested that this level
of detail is necessary. In particular, the Patient Safety theme must be understood in terms of the
experience with each of its components because they are so varied, encompassing different
provider settings (nursing homes, hospitals, others); different stages of understanding
(Methicillin-Resistant Staphylococcus Aureus (MRSA) and drug safety are new); and different
approaches.
Change to the Case Study QIO Discussion Plan. Because of the later timing of the QIO
survey relative to what was originally planned, it is no longer necessary to ask QIO case study
participants to update their survey responses prior to the visit, thus reducing the burden on them
of participating in the case study site visits.
Community Health Leaders. Community health leaders will be asked to provide their
observations and views about the QIO’s work, including its impact on health care, what activities
by the QIO had greater and lesser value in fostering improvements, the state quality environment
and how it affected QIOs’ work and success, their advice to make the QIO Program more
effective, and remaining barriers to further improvement in the state.
Providers that Worked with the QIO. Providers that worked with the QIO will tell us about
how they got involved in the initiative and whether and how the experience may have affected
their operations and quality of care. They will explain which QIO activities had greater and
lesser value for them and what lessons were learned as a result of the initiative with the QIO.

Then we will talk through the provider's quality improvement story, to review with
representatives the provider's performance trend on the measure(s) of interest; we will ask them
which actions, taken at what times to attempt to improve on the measure, they believe led to
performance improvements or failed to lead to measured improvements. These stories are
expected to yield insights for the evaluation into the role of the QIO, other factors both within
and outside the hospital, and how all of these factors played into the observed trends in
performance. In addition, we will discuss the state’s quality environment and remaining barriers
to improvement and will obtain the provider’s advice to CMS on how to improve the program.
(The invitation letters and case study discussion guides for the various types of respondent
groups are included in Appendix F).
Discussions with QIO Partner Organizations
The invitation letter, Prevention – CKD partner screener, and discussion guide are provided
in Appendix G. The process to identify the organizations to hold the discussions with will be
slightly different for the Prevention – CKD theme and the Care Transitions theme, although the
discussion protocol is similar.
For the Prevention – CKD theme, the lead contacts at the QIO partner organizations will be
contacted by email and/or phone first to screen them regarding their level of engagement with
the QIO during the Ninth SOW period, or any influence the QIO may have had on their
activities. Those who report having been significantly engaged with the QIO or who report
significant QIO influence (up to 8 per state) will be invited to participate in a 45-minute
telephone discussion at a time of their convenience.
For the Care Transitions theme, to help us identify organizations to hold discussions with, we
plan to review the "Proportions of Transitions" table that each QIO has received from the QIO
Support Center for the Care Transitions theme. According to CMS, this table shows
quantitatively the extent of movement of patients from one facility to another among the
facilities participating in the Care Transitions theme project (such as from each hospital to
various nursing homes and home health agencies). After reviewing this table for each selected
state, we will identify a set of targeted respondents that represent a mix of hospitals, nursing
homes, home health agencies, and other providers that together account for a high proportion of
the transitions in care which are the target of the QIO intervention in this theme. If the QIO has a
major collaborating organization in its intervention, we will purposely include that organization
as well. In total, we will select up to 14 respondents per state.
For both themes, we will email the invitation to the identified contacts at least two weeks
prior to a target week for the interview and then follow the email with a phone call. A list of
topics for discussion will be sent to partners agreeing to be interviewed prior to the interview.
Partners who agree to participate will be informed that their responses are confidential and will
be reported in aggregate form only.
Prior to interviewing the partners, we will have reviewed the QIOs' quarterly reports to CMS
relevant to the QIO work with the partners. The QIOs will be asked for an update of partner
activities since submission of the most recent quarterly report, for information to clarify
questions that may arise from reviewing the quarterly reports, and about future plans for the
remaining time of the Ninth SOW.
3. Methods to Maximize Response Rates

QIO Survey
Mathematica will utilize an initial advance mailing to alert QIOs about the survey. The letter
will be printed on CMS letterhead, personally addressed to the CEO or other principal contact
responsible for the QIO contract, and signed by the CMS Project Officer. Targeting the right
individuals is the first essential step to achieving a high response rate. Therefore Mathematica
will immediately follow the CMS advance letter with an email request for the names, emails and
phone numbers of the individuals with responsibility for each theme or patient safety sub-theme,
and for the best person to respond to the QIO Director questionnaire. To enable them to make
this decision, we state in the letter that the QIO Director questionnaire is intended for an
executive with ongoing management responsibility and knowledge of the QIO’s experience
operating the program under the 9th SOW contract. We also include the email address and toll-free telephone number of Martha Kovac, Mathematica's survey director for the study, whom
they can contact for assistance.
QIOs will be motivated to respond to the survey due to the study sponsor, the relevant
subject matter, and the web-based survey mode, which provides an easy and convenient method
for response. In addition, we cite the requirement in their contract that they provide an
independent evaluator with data upon request, making their time spent responding chargeable to
their contract.
Roughly one week after the initial letter on CMS letterhead is mailed, a follow-up email
from Mathematica will be sent to the QIO CEOs (see Appendix C). The purpose of this email is
to request contact information for the QIO Director and Theme Leader Surveys. An Excel
spreadsheet will be attached, to minimize the burden of completing this task.
Phone calls will be placed to all executive respondents who have not returned their table with
appropriate respondent contact information within five to seven business days.
Once contact information is received from the QIOs,[6] invitation emails will be sent to the
identified individuals. Included will be a secure link to the applicable survey (QIO Director
Survey or QIO Theme Leader Survey). The link will have the respondent's unique ID and
password embedded. One week later, a second email will be sent to all QIO respondents to thank
those who completed the survey and to encourage those who haven't responded to log on and do
so. Another round of reminder emails will be sent every one to two weeks until the end of the
data collection period. Reminder phone calls may be made to non-responders near the end of the
field period, if needed to achieve the desired response rates. These efforts are projected to yield a
response rate of 100 percent among QIO directors and 85 percent among theme leaders.

[6] Depending on the timeliness of receipt, we may need to send invitations on a rolling basis, as completed contact information is received. However, our hope is that we can collect all contact information in a timely fashion and do a single release of email invitations to the entire sample.
Hospital and Nursing Home Surveys
MPR will utilize an initial advance mailing to alert hospital and nursing home facilities
about the survey. The letter will be printed on CMS letterhead, personally addressed, and signed
by the CMS Project Officer. It will include a toll-free telephone number at MPR to call for
information or to set up an appointment, as well as a fact sheet with answers to commonly asked
questions. Hospitals and nursing homes will be motivated to respond to the survey due to the
study sponsor, the relevant subject matter, and a survey mode that allows them to schedule an
appointment to complete the survey at their convenience.
MPR interviewers will actively call all sampled facilities to conduct interviews or to set
appointments for conducting interviews. Interviewers will dial at all times of the day and will
leave messages with facility support staff that include the toll-free number if a respondent is not
available. Roughly eight weeks after the initial letter is mailed, a second letter will be sent to all
nonresponding facilities to encourage them to set up survey appointments. A final round of
letters will be sent to all remaining nonresponding facilities roughly 13 weeks after the initial
letter.
These efforts are projected to yield a 70 percent response rate. This relatively low response
rate increases the potential for nonresponse bias, but does not imply that survey estimates will
necessarily exhibit bias. To reduce any potential bias, the data for the responding institutions will
be weighted to reflect the distribution of the study populations (as represented on the sampling
frames) on the same characteristics used for stratification. Two sets of weights will be prepared,
one for use only in the RD analysis and one for national estimates of characteristics of study-eligible hospitals and nursing homes.
Our non-response analysis will include three steps: (1) comparisons of response rates across
subgroups, (2) comparisons of responding institutions on population characteristics (obtained
from the sampling frame), and (3) an examination of whether the variables available from the
external source (the frame) are correlated with study variables. The frame variables used will be
those used for stratification. This examination will be conducted first using tabulations of
response rates for key subgroups of providers by "baseline" characteristics (such as
region of country, facility ownership, bed size, and so on) and comparing the weighted
distributions of respondents and nonrespondents for baseline characteristics.
If the results of these initial analyses suggest that further exploration is necessary, potential
additional analyses include identifying the characteristics that best predict nonresponse through
techniques to detect interactions and through regression modeling (“response propensity
modeling”), using this information to generate nonresponse weight adjustments, and comparing
the distributions of respondents using the fully response-adjusted analysis weights for baseline
characteristics to the distributions for the full sample comparably weighted using the unadjusted
sampling weights. These analyses can highlight situations in which the potential for nonresponse
bias is greatest and where greater caution should be exercised in the interpretation of the
observed findings.
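If propensity modeling proves necessary, a minimal sketch of the adjustment is below: fit a logistic model of response on the frame variables used for stratification, and divide each respondent's base sampling weight by its predicted response probability. The column names and the use of statsmodels are illustrative assumptions, not the project's specification, and the actual adjustment could instead be carried out within weighting classes.

import pandas as pd
import statsmodels.formula.api as smf

def propensity_adjusted_weights(frame: pd.DataFrame) -> pd.Series:
    """Nonresponse-adjust base sampling weights with a response propensity model.

    Assumed columns: 'responded' (0/1), 'base_weight', and the frame variables
    used for stratification ('cms_region', 'ownership', 'size_class')."""
    model = smf.logit(
        "responded ~ C(cms_region) + C(ownership) + C(size_class)", data=frame
    ).fit(disp=False)
    propensity = model.predict(frame)                 # estimated response probability
    adjusted = frame["base_weight"] / propensity      # respondents carry nonrespondents' weight
    return adjusted.where(frame["responded"] == 1, other=0.0)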
Case Study Discussions
Mathematica will take a number of steps to gain organizations’ participation in the in-person
and telephone case study discussions. The site visit scheduling process will begin 15 weeks prior
to the target week for the visit and proceed as follows:


• Fifteen weeks prior: A letter will be sent to the QIO describing our plans to visit and
requesting (1) that key staff set aside time for the discussions during 8 a.m.–1 p.m.
Monday of the target week; (2) identification of the providers the QIO has been
working with on each theme, along with their location and key contact, by 11 weeks
prior to the visit; (3) identification of community health leaders by 11 weeks prior to
the visit; and (4) that they review and update the QIO survey and return it to
us by three weeks prior to the target week. The provision in their contract requiring
their cooperation with an evaluation will be cited.

• Nine to eleven weeks prior: Mathematica will select providers for the visit, based on
the QIO's lists of the providers it has worked with, and will select community health leaders.
Mathematica may add to or substitute other community health leaders for the ones the
QIO provides, based on our understanding of the state’s health system.
• Nine weeks prior: Invitation letters are sent to providers targeted for a site visit. A
letter of encouragement to participate and assurance of confidentiality signed by a
CMS official will be attached.
• Two to eight weeks prior: Follow up to the letters occurs on a regular schedule of
repeated telephone and/or email contacts beginning a week after the initial letter was
sent, to gain agreement by the targeted providers and community health leaders and to
schedule the specific time, date, and place of the meeting. Confirmation letters are
emailed within three days after agreement on a date and time for the interview.
Background information for the site visit is gathered and logistics are arranged
(flights, hotels, rental car, maps, and directions).
• One week prior: All scheduled interviews are confirmed by telephone and/or email.
The junior site visitor provides the senior site visitor with the final site visit materials,
including background information on each organization to be visited, the QIO survey,
the interview guides, and all logistics information.
4. Tests of Procedures or Methods

QIO Survey
We conducted a pretest of the QIO and theme leader surveys with a convenience sample of
three QIOs and three or fewer theme leaders from each QIO, for a total of nine theme leaders. We
made initial calls to solicit cooperation, emailed respondents a log-in number and password, and
asked them to complete the web survey and record the time it took them to complete it. We then
held a debriefing telephone call to obtain the survey length, assess their understanding of the
questions, and identify any areas of confusion or navigational problems they had in completing
the survey.

Hospital and Nursing Home Surveys
We conducted a pretest of the hospital and nursing home surveys with a convenience sample
of nine hospitals and nine nursing homes. We made initial calls to solicit cooperation, set up
appointments to conduct the survey, and completed the survey with respondents by telephone
while keeping careful record of the start and end time of each survey. We held a debriefing at the
conclusion of the survey to assess their understanding of the questions, and identify any areas of
confusion they had in answering the survey questions.
Discussions with QIO Partner Organizations and Case Study Discussions Protocol
The discussion guides for the QIO evaluation were developed by researchers
familiar with the QIO program and with experience interviewing the types of respondents
relevant here. The guides were designed to meet CMS objectives for the information to be
gathered, based on the evaluation statement of work. The discussion guides are not structured
instruments; rather, they provide a framework for discussion in which the researcher uses active
listening techniques to ensure that the respondent understands the question at hand in the context
of the unique local circumstances and is informing the research on the topic. The timeframes
for the discussions are set and respected by the researcher. In addition, the QIO executives/staff
discussion guide is largely based on the concept of following up QIO survey items to better
understand the QIO’s responses.
5. People Involved in Design

The following people have contributed to the study design and to the design of the survey
instruments, discussion guides, and site visit protocols:
• Dr. Myles Maxfield, an MPR senior health researcher and study project director,
(202) 484-4682


• Ms. Sue Felt-Lisk, an MPR senior health researcher and study co-principal
investigator, (202) 484-4519
• Dr. Arnold Chen, an MPR senior health researcher and study co-principal
investigator, (609) 275-2336
• Ms. Martha Kovac, an MPR associate director of survey research and study survey
director, (609) 275-2331
• Dr. Andrew Clarkwest, an MPR health researcher, (202) 250-3501
• Dr. Kirsten Barrett, an MPR survey researcher, (202) 554-7564
• Claudia Schur, a Social and Scientific Solutions (SSS) CHRP deputy, (301) 628-3001
• Kathryn Anne Paez, an SSS researcher, (301) 628-3001
• Therese Moore, an Abt principal associate and study director of nursing home care,
(617) 492-7100


REFERENCES

Agency for Healthcare Research and Quality. 2006 National Healthcare Quality Report. Rockville, MD: AHRQ, December 2006. Available at: http://www.ahrq.gov/qual/nhqr06/nhqr06report.pdf
Bradley, E.H., et al. "From Adversary to Partner: Have Quality Improvement Organizations
Made the Transition?" Health Services Research, vol. 40, no. 2, pp. 459–476.
Chen, Arnold, Andrew Clarkwest, Sarah Croake, Sue Felt-Lisk, Kate Stewart, and Christianna
Williams. “Program Evaluation of the 9th Scope of Work QIO Program: Evaluation
Methodology, Conceptual Framework, and State Specific Provider Environment Task.”
Princeton, NJ: Mathematica Policy Research. April 29, 2010.
Chromy, J.R. “Sequential Sample Selection Methods.” Proceedings of the Survey Research
Methods Section of the American Statistical Association, 1979, pp. 401-406.
Clarkwest, A., S. Felt-Lisk, S. Croake, and A. Chen. “Assessment of the Eighth Scope of Work
of the Medicare Quality Improvement Organization Program: Final Report.” Washington,
DC: Mathematica Policy Research, April 14, 2009.
Institute of Medicine. “Medicare’s Quality Improvement Organization Program: Maximizing
Potential.” Washington, DC: IOM, March 2006.
Jencks et al., “Change in the Quality of Care Delivered to Medicare Beneficiaries, 1998–1999 to
2000–2001.” JAMA, vol. 289, no. 3, pp. 305–312.
Rollow, W., Lied, T.R., McGann, P., Poyer, J., LaVoie, L., Kambic, R.T., Bratzler, D.W., Ma,
A., Huff, E.D., and Ramuno, L.D. “Assessment of the Medicare Quality Improvement
Organization Program.” Annals of Internal Medicine, vol. 145, no. 5, Sept 5, 2006, pp. 342.
Snyder, C., Anderson, G. “Do Quality Improvement Organizations Improve the Quality of
Hospital Care for Medicare Beneficiaries?” JAMA, vol. 293, 2005, pp. 2900–2907.


