Supporting Statement for the National Implementation of the Hospital CAHPS Survey
Supporting Statement – Part A
A 1.0 CIRCUMSTANCES OF INFORMATION COLLECTION
A 1.1 Background
The intent of the HCAHPS initiative is to provide a standardized survey instrument and data
collection methodology for measuring patients' perspectives on hospital care. While many hospitals
currently collect information on patients' satisfaction with care, there is no national standard for
collecting or publicly reporting this information that would enable valid comparisons to be made
across all hospitals. In order to make “apples to apples” comparisons to support consumer choice, it
is necessary to introduce a standard measurement approach. HCAHPS can be viewed as a core set of
questions that can be combined with a broader, customized set of hospital-specific items. HCAHPS
is meant to complement the data hospitals currently collect to support improvements in internal
customer services and quality related activities.
Hospitals began using HCAHPS, also known as Hospital CAHPS or the CAHPS Hospital Survey,
under the auspices of the Hospital Quality Alliance, a private/public partnership that includes the
American Hospital Association, the Federation of American Hospitals, the Association of
American Medical Colleges, the Joint Commission on Accreditation of Healthcare Organizations,
the National Quality Forum, AARP, CMS/AHRQ, and other stakeholders who share a common
interest in reporting on hospital quality. This Alliance has been proactive in making performance
data on hospitals accessible to the public, thereby improving care. In March 2008, the first results
from the HCAHPS survey were publicly reported on the Hospital Compare website, which can be
found at www.hospitalcompare.hhs.gov, or through a link on www.medicare.gov.
CMS is sensitive to the costs that will be borne by the hospitals that participate in the HCAHPS
initiative. In order to gain a full and detailed understanding of the range of costs associated with
implementation of HCAHPS, in 2005 CMS commissioned Abt Associates, Inc. to conduct a
thorough investigation of the costs and benefits of HCAHPS.
Costs associated with collecting HCAHPS will vary depending on:
o The method hospitals currently use to collect patient survey data;
o The number of patients surveyed (target is 300 completed surveys per year); and
o Whether it is possible to incorporate HCAHPS into their existing survey.
Abt Associates‟ cost estimates are for data collection and transmission to CMS only and do not
include administrative, information technology, or other costs that hospitals may incur as a result of
HCAHPS. Abt estimates that the average data collection cost for a hospital conducting the 27-item
version of HCAHPS as a stand-alone survey would be between $3,300 and $4,575 per year, assuming
that 80-85 percent of hospitals collect HCAHPS by mail and the remainder by phone or active IVR.
The costs for combining HCAHPS with existing surveys would be considerably less. It would cost
$978 per hospital annually to incorporate the 27-item version of HCAHPS into existing surveys.
Abt estimates that the nationwide data collection cost for conducting the 27-item version of HCAHPS
would be between $4.1 million and $19.1 million per year if all eligible hospitals participated,
depending upon the extent to which hospitals integrate HCAHPS with their existing survey activities
or conduct it as a stand-alone survey. Even if all hospitals were to collect HCAHPS as a stand-alone
survey, the maximum national cost estimate represents less than 0.02 percent of overall hospital costs.
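As a rough consistency check (a back-of-the-envelope sketch only; the count of roughly 4,200 eligible
hospitals is an assumption inferred from the figures above, not a number taken from the Abt report),
the nationwide range follows directly from the per-hospital estimates:

    # Back-of-the-envelope check of the nationwide cost range.
    # ASSUMPTION: roughly 4,200 eligible hospitals (inferred, not from the Abt report).
    eligible_hospitals = 4200
    integrated_cost = 978      # per hospital, HCAHPS folded into an existing survey
    stand_alone_cost = 4575    # per hospital, upper bound of the stand-alone estimate

    low = eligible_hospitals * integrated_cost     # about $4.1 million per year
    high = eligible_hospitals * stand_alone_cost   # about $19.2 million per year
    print(f"${low:,} to ${high:,} per year")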
While the potential benefits of HCAHPS cannot be enumerated as precisely as its costs, Abt
concluded that:
“What we can conclude with some level of confidence is that the marginal costs
associated with a longer version of HCAHPS are likely to be relatively small, so if
there is a reasonable basis for believing that the 27-item version of HCAHPS offers better
information to consumers than a shorter alternative, then there are good reasons for
implementing the current 27-item version of HCAHPS.”
The Executive Summary and Final Report cost-benefit analysis, “Costs and Benefits of HCAHPS”,
may be found at www.cms.gov/HospitalQualityInits in the section called “HCAHPS: Patients’
Perspectives of Care Survey.”
In the spring of 2002, the Centers for Medicare & Medicaid Services (CMS) requested that the Agency
for Healthcare Research and Quality (AHRQ), through the CAHPS team, develop and test a survey
through which hospital patients could assess the care they receive. This process has been consistent
with other CAHPS survey development processes, including: review of the relevant literature; review
of existing hospital surveys obtained through a public call for measures; pilot testing; and a series of
opportunities for public input.
Three broad goals have shaped HCAHPS. First, the survey is designed to produce comparable data
on the patient's perspective on care that allow objective and meaningful comparisons between
hospitals on domains that are important to consumers. Second, public reporting of the survey results
is designed to create incentives for hospitals to improve their quality of care. Third, public reporting
will serve to enhance public accountability in health care by increasing the transparency of the quality
of hospital care provided in return for the public investment. With these goals in mind, the HCAHPS
project has taken substantial steps to assure that the survey will be credible, useful, and practical.
This methodology and the information it generates are available to the public.
CMS partnered with AHRQ to develop a standard instrument and data collection and reporting
procedures that will capture patients' perspectives of their hospital care. AHRQ is a leader in
developing instruments for measuring patient perspectives on care. AHRQ and its grantees
developed the Health Plans CAHPS Survey, which is currently used to assess the care provided by
health plans covering over 132 million Americans. While CAHPS has been accepted as the industry
standard for measuring consumers’ perspectives in the healthcare system, it does not address patients’
perspectives in the acute care setting. In response to this need, AHRQ initiated the process of
developing a public domain survey instrument for the acute care setting. AHRQ used its experience
and expertise to work with CMS to develop both a standard survey for measuring patients' reports
and ratings of their care in the hospital setting, and approaches to reporting the results to consumers.
Steps in creating and implementing the survey are summarized in Table 1.
Table 1: HOSPITAL CAHPS SURVEY® Development and Implementation

July 2002: Published a "call for measures" in the Federal Register and received 7 submissions from Avatar, Edge Health Care Research, Healthcare Financial Management Association, Jackson Organization, Press Ganey Associates, National Research Corporation, Peace Health, Professional Research Consultants, and SSM Health Care.
Sept-Nov 2002: Completed literature review.
Oct 2002: Held a Web chat to answer questions about HCAHPS.
Oct 2002: Provided draft domains to CMS.
Nov 2002: Reviewed measures submitted in response to Federal Register Notice (FRN).
Nov 2002: Held Stakeholders Meeting to solicit suggestions and comments.
Nov 2002: Held vendors meeting to solicit suggestions and comments.
Jan 2003: AHRQ delivered 66-item draft survey to CMS for use in pilot test.
Jan-Feb 2003: Developed data collection and sampling methods, and developed analysis plan.
Feb 2003: Published FRN soliciting comments on draft HCAHPS.
Mar 2003: Completed hospital recruitment for pilot.
June 2003: Began data collection for CMS 3-state pilot test.
June 2003: Published a FRN soliciting comments on draft HCAHPS and asking for input about implementation issues.
Sept-Nov 2003: Analyzed data from CMS pilot test.
Fall 2003: Review of instrument by CAHPS Cultural Comparability team.
Fall 2003: Began CT pilot test of the HCAHPS instrument.
Nov 2003: Held HCAHPS Stakeholders' Meeting at AHRQ.
Nov 2003: Revised HCAHPS instrument to 32 items.
Dec 2003: AHRQ submitted revised 32-item HCAHPS instrument to CMS.
Dec 2003-Feb 2004: Published a FRN soliciting input on the 32-item HCAHPS instrument and implementation strategy.
January 2004: Started coordination of national implementation with HSAG, the AZ QIO.
Jan 2004: Completed CT pilot test of HCAHPS.
Jan 2004: AHRQ submitted Analysis Report of the CMS 3-state pilot to CMS.
March-Sept 2004: Continued discussions with hospitals, vendors, and consumers to follow up on FRN comments from February.
Oct 2004: Revised 25-item HCAHPS instrument submitted by AHRQ to CMS.
November 2004: Submitted HCAHPS to NQF for its consensus development process.
December 2004: Started developing training documents for national implementation.
April 2005: Started discussions regarding data transmission via QNET and other issues with the IFMC, the IA QIO.
June 2005: Formed the Data Integrity Group.
May 2005: Received endorsement for 27-item HCAHPS from the National Quality Forum.
May 2005: Modified survey instrument and protocol to 27 items.
June 2005: Abt Associates, Inc. received OMB approval for cost-benefit analysis.
July 2005: Established the Data Submission and Reporting Group.
October 2005: Abt Associates, Inc. submitted final report of the cost-benefit analysis.
November 2005: Published FRN soliciting comments on draft CAHPS Hospital Survey.
December 2005: Received final approval.
February-May 2006: Mode Experiment.
October 2006: National implementation begins.
July 2007: HCAHPS participation linked to RHQDAPU program ("pay for reporting").
March 2008: First public reporting of HCAHPS results.
Throughout the HCAHPS development process, CMS has solicited and received a great deal of
public input. As a result, the HCAHPS questionnaire and methodology have gone through several
iterations. The first was a 66-item version that was tested in a three-state pilot study (this was
developed for testing purposes; we never intended that the final version be this long). Prior to the start
of the pilot test, a Federal Register notice was published in February 2003 soliciting input on the
proposed pilot study. This notice produced nearly 150 comments. Based on results of the pilot study,
the questionnaire was reduced to 32 items. CMS received additional feedback from a Federal
Register Notice published in June 2003 that sought further comment on the survey and
implementation issues while the initial pilot testing was underway. CMS received 110 responses to
the notice from hospital associations, provider groups, consumers/purchasers, and hospital survey
vendors.
A 32-item version of the HCAHPS instrument was published in the Federal Register in December
2003 for public comment; CMS received nearly 600 comments that focused on the following topics:
sampling, response rate, implementation procedures, cost issues, the length of the instrument,
exclusion categories, question wording, and reporting.
CMS, AHRQ, the CAHPS grantees and members of the Instrument, Analysis, and Cultural
Comparability teams met via conference calls two to three times per week to review all comments
from the three Federal Register Notices and to modify the survey instrument and implementation
strategy based on the comments. The input we received conflicted. Survey vendors and many
hospitals indicated that the 32-item version was too long, while consumer groups indicated that the
full content was needed to support consumer choice. After the comments were reviewed, CMS and
AHRQ held additional discussions with hospitals, vendors and consumers to discuss the comments
received.
Using the comments received from the public and stakeholders, and the psychometric analysis of the
data from the 3-state pilot study, CMS reduced the next version of the HCAHPS questionnaire to 25
items. The following questions were eliminated from the longer versions of the questionnaire: overall
rating of the doctor; overall rating of the nurse; how often did doctors treat you with courtesy and
respect; how often did nurses treat you with courtesy and respect; overall mental health status; and
two items related to whether and how a proxy helped complete the survey. The item regarding how
often hospital staff asked if you were allergic to any medicine was also eliminated from the survey. A
question from the original 66-item survey was used to replace the allergy question. This newly
reintroduced item asks, "Before giving you the medicine, how often did hospital staff tell you what the
medicine was for?" (Question 14 on the 25-item survey). In response to public input, we also
eliminated the reference to doctors in the screener question that identifies patients who needed help
getting to the bathroom or using a bedpan (Question 8 on the 25-item survey). The questions on
overall mental health status and the two proxy questions were dropped because their impact in the
patient-mix adjustment was negligible. Questions about being shown courtesy and respect by
doctors and nurses were pulled from the latest version because input received indicated that the two
items on doctors or nurses “explaining things fully” and “listening carefully” were synonymous with
showing courtesy and respect. Taking these questions out of these composites did not adversely
impact the psychometric properties of the doctor and nurse communication composites. The allergy
question originally had a scale of "never" to "always" that didn't work well because the question is
usually asked only once, upon admission. Changing the scale to a "yes/no" response provided very little
variation across hospitals.
As the HCAHPS survey and implementation procedures evolved, CMS and AHRQ worked to
develop procedures that allow as much flexibility as possible and minimize disruption to current
survey activities. For instance, prior to developing an implementation strategy,
CMS solicited input through the June 2003 Federal Register notice requesting feedback on mode of
survey administration, periodicity of administration, and specific criteria for inclusion in the sampling
frame. Using this feedback, CMS developed an implementation strategy that has been modified as a
result of additional input received.
The basic implementation procedures were set up to provide flexibility to survey vendors and
hospitals. For example, following training, approved hospitals or vendors can administer the Hospital
CAHPS survey either (a) as a stand-alone survey or (b) integrated with the hospital's existing survey.
The survey will be conducted continuously throughout the year to make it easier to integrate with
existing survey activities. Because vendors and hospitals currently use multiple modes to administer
their internal hospital surveys, multiple survey modes are allowed. The hospitals/vendors may use
any of the following survey modes: telephone only, mail only, a mixed methodology of mail with
telephone follow-up, or active interactive voice response (IVR). All modes of administration must
follow a standardized protocol. Since different modes of administration may affect how patients
respond, CMS conducted a large-scale mode experiment in 2006 to determine appropriate
adjustments to the data for public reporting.
In addition to the development and review processes outlined above, CMS submitted the 25-item
version of the HCAHPS instrument to the National Quality Forum (NQF), a voluntary consensus
standard-setting organization established to standardize healthcare quality measurement and
reporting, for its review and endorsement. NQF endorsement represents the consensus of numerous
healthcare providers, consumer groups, professional associations, purchasers, federal agencies, and
research and quality organizations. Following a thorough, multi-stage review process, HCAHPS was
endorsed by the NQF board in May 2005. In the process, NQF recommended a few modifications to
the instrument. As a result of the recommendation of the National Quality Forum Consensus
Development Process, the two courtesy and respect items were added back into the survey. The
review committee felt that these questions are important to all patients, but may be particularly
meaningful to patients who are members of racial and ethnic minority groups. The two reinstated
items are: “During this hospital stay, how often did nurses treat you with courtesy and respect?”, and
“During this hospital stay, how often did doctors treat you with courtesy and respect?”
Another recommendation from the NQF was to expand the response categories for the ethnicity
question in the “About You” section as follows:
No, not Spanish/Hispanic/Latino
Yes, Puerto Rican
Yes, Mexican, Mexican American, Chicano
Yes, Cuban
Yes, other Spanish/Hispanic/Latino
Acting on another recommendation of the National Quality Forum, CMS further examined the
costs and benefits of HCAHPS. This cost-benefit analysis of HCAHPS was independently conducted
by Abt Associates, Inc., and completed on October 5, 2005. The Executive Summary and Final
Report cost-benefit analysis, “Costs and Benefits of HCAHPS”, may be found at
http://www.cms.gov/HospitalQualityInits/ in the section called “HCAHPS: Patients’ Perspectives of
Care Survey.”
The accumulated lessons learned from the pilot testing, public comments, input from stakeholders,
numerous team discussions, and the National Quality Forum's review and endorsement through their
consensus development process led to the version of the HCAHPS survey that has 27 items and the
HCAHPS data collection protocol that allows hospitals to integrate their own specialized questions.
The resulting core questionnaire is comprised of questions in several dimensions of primary
importance to the target audience: doctor communication, responsiveness of hospital staff, cleanliness
of the hospital environment, quietness of the hospital environment, nurse communication, pain
management, communication about medicines, and discharge information. The 27-item HCAHPS
survey that was formally endorsed by the NQF may be found in Appendix A.
The HCAHPS implementation plan, in particular, changed significantly as a result of the public input
we received. CMS has made the following major changes in the implementation approach:
o reduced the number of mailings for the "mail only" survey protocol from three to two;
o reduced the number of follow-up phone calls for the "telephone only" survey protocol from ten to five;
o added active interactive voice response (IVR) as a mode of survey administration;
o eliminated the cap on the number of hospital/vendor questions added to the HCAHPS items;
o eliminated the 50% response rate requirement;
o reduced the number of patient discharges to be surveyed.
Since national implementation began in 2006, CMS has continually refined and clarified HCAHPS
survey protocols, created new translations of the mail version of the survey (in Chinese, Russian and
Vietnamese), annually updated the HCAHPS Quality Assurance Guidelines (currently version 6.0;
www.hcahpsonline.org/qaguidelines.aspx), improved the appearance and accessibility of HCAHPS
results on the Hospital Compare website, and made information about HCAHPS quickly and easily
available through its official HCAHPS On-Line website, www.hcahpsonline.org. In addition, the
HCAHPS Project Team has published several analyses of HCAHPS results in peer-reviewed
scientific journals and made numerous presentations at professional conferences.
After public reporting of hospitals' HCAHPS results was inaugurated in March 2008, a growing
number of healthcare, consumer and professional organizations, state governments, media outlets and
others have adopted or incorporated HCAHPS scores, in part or in whole, for their own purposes.
These activities, external to CMS, have had the effect of extending knowledge about HCAHPS and
increasing the impact of survey results.
Finally, the content of the HCAHPS survey, its methodology and administration protocols, and its
ambition to measure and publicly report consumers' experiences in a uniform and standardized
manner have influenced other surveys developed within CMS as well as those undertaken by
hospitals and healthcare systems in the United States and abroad.
There are distinct roles for hospitals, or their survey vendors, and the federal government in the
national implementation of HCAHPS. The government is responsible for support and public
reporting, including:
conducting training on data collection and submission procedures;
providing on-going technical assistance;
ensuring the integrity of data collection;
accumulating HCAHPS data from individual hospitals;
producing patient-mix adjusted hospital-level estimates;
conducting research on the presentation of data for public reporting; and,
publicly reporting the comparative hospital data.
Hospitals or their survey vendors are responsible for data collection, including: developing a
sampling frame of relevant discharges, drawing the sample of discharges to be surveyed, collecting
survey data from sampled discharges, and submitting HCAHPS data to CMS in a standard format.
We have formatted the data files so that hospitals/vendors will submit de-identified data files to CMS
in accordance with 45 CFR §164.514. As they currently do, hospitals will maintain business associate
agreements with their survey vendors. Hospitals securely submit their HCAHPS data to CMS through
QualityNet Exchange.
CMS started collaboration with the Health Services Advisory Group (HSAG) in 2003 to coordinate
the national implementation of the Hospital CAHPS Survey. HSAG's role is to provide technical
assistance and training for vendors and hospitals, data validation, data processing, analysis, and
adjustment, and oversight of self-administering hospitals and survey vendors. HSAG also produces
electronic data files and a hospital level extract file for public reporting of the HCAHPS scores.
In the spring of 2006 CMS conducted a large-scale experiment to assess the impact of mode of survey
administration, patient characteristics and patient non-response on HCAHPS results. This Mode
Experiment was based on a nationwide random sample of short-term acute care hospitals. Hospitals
from each of CMS' ten geographic regions participated in the Mode Experiment. A hospital's
probability of being selected for the sample was proportional to its volume of discharges, which
guaranteed that each patient would have an equal probability of being sampled for the experiment.
The participating hospitals contributed patient discharges from a four-month period: February,
March, April, and May 2006. Within each hospital, an equal number of patients was randomly
assigned to each of the four modes of survey administration. Sample selection and surveying were
conducted by the National Opinion Research Center of the University of Chicago, and the data were
analyzed by the RAND Corporation.
A randomized mode experiment of 27,229 discharges from 45 hospitals was used to develop
adjustments for the effects of survey mode (Mail Only, Telephone Only, Mixed mode, or Active
Interactive Voice Response) on responses to the HCAHPS survey. In general, patients randomized to
the Telephone Only and Active Interactive Voice Response modes provided more positive evaluations than
patients randomized to Mail Only and Mixed (Mail with Telephone follow-up) modes. These mode
effects varied little by hospital, and were strongest for global items (rating and recommendation), and
the Cleanliness & Quiet, Responsiveness, Pain Management, and Discharge Information composites.
Adjustments for these mode effects are necessary to make the reported scores independent of the
survey mode that was used. These adjustments are applied to HCAHPS results before they are
publicly reported on the Hospital Compare website. The mode adjustments can be found in the
“Mode Adjustment” section of the HCAHPS website, www.hcahpsonline.org. The algorithm for the
patient-mix adjustment can be found in Appendix B.
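Purely as an illustration of how such a fixed, additive adjustment can be applied (the adjustment values
and names below are hypothetical placeholders, not the published HCAHPS adjustments, which are
available at www.hcahpsonline.org):

    # Illustrative sketch of applying a fixed survey-mode adjustment.
    # The values below are HYPOTHETICAL placeholders, not the published HCAHPS adjustments.
    HYPOTHETICAL_MODE_ADJUSTMENTS = {
        ("overall_rating", "Telephone Only"): -2.0,  # percentage points of top-box responses
        ("overall_rating", "Active IVR"): -2.5,
        ("overall_rating", "Mixed"): 0.0,
        ("overall_rating", "Mail Only"): 0.0,
    }

    def adjust_for_mode(measure: str, mode: str, patient_mix_adjusted_score: float) -> float:
        """Apply the fixed mode adjustment after patient-mix adjustment."""
        return patient_mix_adjusted_score + HYPOTHETICAL_MODE_ADJUSTMENTS[(measure, mode)]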
The Mode Experiment also provided valuable information on the impact of salient patient
characteristics and non-response bias on HCAHPS results. This analysis was needed because
hospitals do not provide care for comparable groups of patients but, as demonstrated in the HCAHPS
Three-State Pilot Study, some patient characteristics may affect measures of patient experiences of
care. The goal of patient-mix adjustment, which is also known as case-mix adjustment, is to estimate
how different hospitals would be rated if they provided care to comparable groups of patients. As
suggested by the Three-State Pilot Study, a set of patient characteristics not under control of the
hospital was selected for analysis. In summary, the most important patient-mix adjustment items
were patients' self-reported health status, education, service line (maternity, medical or surgical care)
and age. Patient-mix effects were generally less potent than survey mode. In addition, after mode
and patient-mix adjustments have been made, non-response effects were found to be negligible. A
report on patient-mix and non-response adjustments is available on the HCAHPS On-Line website,
www.hcahpsonline.org. The algorithm for the patient-mix adjustment can be found in Appendix B.
We also looked at the extent to which each domain contributes to measurement in priority areas
established by an independent, expert body on quality measurement, the National Quality Forum
(NQF). The HCAHPS domains "communication with doctors", "communication with nurses", and
"communication about medications" will contribute to the NQF's priority on improving care
coordination and communication. The HCAHPS "pain control" domain will contribute to the NQF's
pain management priority, while the HCAHPS “discharge information” domain will contribute to the
priority on improving self-management and health literacy.
CMS, with assistance from HSAG and its sub-contractor, the National Committee on Quality
Assurance (NCQA), has developed and conducted two training programs for self-administering
hospitals and survey vendors participating in the HCAHPS survey, as well as others interested in this
program. HCAHPS Introductory Training was first offered at the CMS headquarters in January
2006, and then by webinar in January and April 2006. Since then, HCAHPS Introductory Training
has been held annually, by webinar. In addition, CMS developed the HCAHPS Update Training
program. HCAHPS Update Training was first offered in May 2007 and has been offered annually by
webinar since then. HCAHPS Introductory Training is required for self-administering hospitals and
survey vendors that wish to join HCAHPS. HCAHPS Update Training provides information on
important changes to the HCAHPS program and is required for all self-administering hospitals and
survey vendors participating in HCAHPS.
Self-administering hospitals and survey vendors may conduct a “dry run” of the HCAHPS survey to
gain experience before they begin to fully participate.
Using the official survey instrument and the
approved modes of implementation and data collection protocols, self-administering hospitals and
survey vendors collect HCAHPS data for eligible patients and submit it to CMS via QualityNet
Exchange. In addition to providing real experience, participation in the dry run produces HCAHPS
data that the hospital can use for self-assessment. Data collected for the dry run are not publicly
reported.
HCAHPS dry runs also proved beneficial to CMS by providing a test of its HCAHPS-related systems
and a check on the usability of the HCAHPS protocols. Based upon hospitals' experience in the first
dry run in Spring 2006, CMS made several improvements to the data coding and submission
processes and decided to exclude several small categories of patients from eligibility for logistical
reasons. These patient categories are: patients who were prisoners, who were discharged to hospice
care, who had a foreign home address, or who upon admission had requested that the hospital not
survey them.
Voluntary collection of HCAHPS data for public reporting began in October 2006. The first public
reporting of HCAHPS results, which encompassed eligible discharges from October 2006 through
June 2007, occurred in March 2008. HCAHPS results are posted on the Hospital Compare website,
found at www.hospitalcompare.hhs.gov, or through a link on www.medicare.gov. Subsequent public
reporting of HCAHPS results is based on four quarters of discharges, with the oldest quarter rolling
off when the most recent quarter rolls on. A downloadable version of HCAHPS results is also
available through this website. The number of hospitals that publicly report HCAHPS results has
increased from 2,521 in March 2008 to 3,812 in July 2011. Additional HCAHPS results can be found
on HCAHPS On-Line, (www.hcahpsonline.org), including a summary of state and national results,
the “top-box” and “bottom-box” percentiles for the ten HCAHPS measures, a table of intercorrelations of the HCAHPS measures, a set of charts that show HCAHPS results by hospital
characteristics (region, bed size, teaching status, ownership and location), and a bibliography of
HCAHPS research publications.
The enactment of the Deficit Reduction Act of 2005 created an additional incentive for acute care
hospitals to participate in HCAHPS. Since July 2007, hospitals subject to the Inpatient Prospective
Payment System (IPPS) annual payment update provisions ("subsection (d)
hospitals") must collect and submit HCAHPS data in order to receive their full IPPS annual payment
update. IPPS hospitals that fail to publicly report the required quality measures, which include the
HCAHPS survey, may receive an annual payment update that is reduced by 2.0 percentage points. Non-IPPS
hospitals, such as Critical Access Hospitals, may voluntarily participate in HCAHPS.
The Patient Protection and Affordable Care Act of 2010 (P.L. 111-148) includes HCAHPS
among the measures to be used to calculate value-based incentive payments in the Hospital
Value-Based Purchasing program (Section 3001), beginning with discharges in
October 2012. Thirty percent of the value-based incentive payment for participating IPPS
hospitals will be determined by performance on the HCAHPS Survey.
For each participating hospital, ten HCAHPS measures (six summary measures, two
individual items and two global items) are publicly reported on the Hospital Compare website,
www.hospitalcompare.hhs.gov.
Each of the six summary measures, or composites, is
constructed from two or three survey questions. Combining related questions into composites
allows consumers to quickly review patient experience of care data and increases the
statistical reliability of these measures. The six composites summarize how well nurses and
doctors communicate with patients, how responsive hospital staff are to patients' needs, how
well hospital staff help patients manage pain, how well the staff communicates with patients
about medicines, and whether key information is provided at discharge. The two individual
items address the cleanliness and quietness of patients' rooms, while the two global items
report patients' overall rating of the hospital, and whether they would recommend the hospital
to family and friends. Survey response rate and the number of completed surveys, in broad
ranges, are also publicly reported. Table 2 lists the questions that comprise the two global
ratings, six composites and two individual items that are publicly reported for HCAHPS.
Table 2. HOSPITAL CAHPS SURVEY Global Ratings and Reporting Composites

Global Ratings

Overall Rating of Hospital
  Q21. Using any number from 0 to 10, where 0 is the worst hospital possible and 10 is the best hospital possible, what number would you use to rate this hospital? (0-10 scale)

Recommendation
  Q22. Would you recommend this hospital to your friends and family? (definitely no, probably no, probably yes, definitely yes)

Composites and Individual Items

Communication with Nurses
  Q1. During this hospital stay, how often did nurses treat you with courtesy and respect? (never, sometimes, usually, always)
  Q2. During this hospital stay, how often did nurses listen carefully to you? (never, sometimes, usually, always)
  Q3. During this hospital stay, how often did nurses explain things in a way you could understand? (never, sometimes, usually, always)

Communication with Doctors
  Q5. During this hospital stay, how often did doctors treat you with courtesy and respect? (never, sometimes, usually, always)
  Q6. During this hospital stay, how often did doctors listen carefully to you? (never, sometimes, usually, always)
  Q7. During this hospital stay, how often did doctors explain things in a way you could understand? (never, sometimes, usually, always)

Cleanliness of the Physical Environment
  Q8. During this hospital stay, how often were your room and bathroom kept clean? (never, sometimes, usually, always)

Quietness of the Physical Environment
  Q9. During this hospital stay, how often was the area around your room quiet at night? (never, sometimes, usually, always)

Responsiveness of Hospital Staff
  Q4. During this hospital stay, after you pressed the call button, how often did you get help as soon as you wanted it? (never, sometimes, usually, always)
  Q11. How often did you get help in getting to the bathroom or in using a bedpan as soon as you wanted? (never, sometimes, usually, always)

Pain Management
  Q13. During this hospital stay, how often was your pain well controlled? (never, sometimes, usually, always)
  Q14. During this hospital stay, how often did hospital staff do everything they could to help you with your pain? (never, sometimes, usually, always)

Communication about Medications
  Q16. Before giving you any new medicine, how often did hospital staff tell you what the medicine was for? (never, sometimes, usually, always)
  Q17. Before giving you any new medicine, how often did hospital staff describe possible side effects in a way you could understand? (never, sometimes, usually, always)

Discharge Information
  Q19. During this hospital stay, did hospital staff talk with you about whether you would have the help you needed when you left the hospital? (yes, no)
  Q20. During your hospital stay, did you get information in writing about what symptoms or health problems to look out for after you left the hospital? (yes, no)
In addition to the questions that are used to form the HCAHPS measures of care from the patient's
perspective, there are also four screener items (not listed in Table 2). These items are included to
ensure that only appropriate respondents answer the questions that follow. For example, there is a
screener item that asks whether during the hospital stay you were given any medicine that you had
not taken before. Only respondents who answer “yes” are directed to answer the two following
questions about how often hospital staff explained what the medicine was for and possible side
effects. We considered dropping the screener questions; however, CAHPS testing has shown that,
without screeners, respondents are more likely to give an inappropriate response than to check off
"not applicable." It was also thought that this potential error would be further exacerbated by having
different modes of administration.
The psychometric performance of the domain-level composites with respect to reliability and
construct validity is summarized in Table 3. The basic unit of reporting for HCAHPS measures is the
hospital. Thus, we focus on the psychometric performance of the composites at the hospital level.
Hospital-level reliability reflects the extent to which variation in scores on a composite reflects
variation between hospitals, as opposed to random variation in patient response within hospitals. We
want measures whose variation reflects differences in performance between hospitals. The
hospital-level reliability reported in Table 3 is calculated as [1 – (1/F)], where F is the F-statistic for testing for
differences among hospitals. The numerator of the F-statistic summarizes the amount of variation
among the means for different hospitals on the measure in question. The denominator summarizes
the amount of random variation expected in these means due to sampling of patients. If there were no
real differences among hospitals, such that all of the differences were due to random variations in the
reports of patients who happened to answer the survey, the hospital-level reliability would be 0.0.
The more real differences among hospitals, relative to random variation, the larger the hospital-level
reliability is expected to be (up to a maximum of 1.0). Note that as we increase sample sizes, the
measures become more precise, so the amount of random variation becomes smaller and the
hospital-level reliability becomes larger.
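For concreteness, a minimal sketch of this calculation, assuming patient-level composite scores
grouped by hospital are available (the function and variable names are illustrative):

    # Hospital-level reliability = 1 - 1/F, where F comes from a one-way ANOVA
    # with hospitals as groups and patient-level scores as observations.
    from scipy import stats

    def hospital_level_reliability(scores_by_hospital):
        """scores_by_hospital: one list of patient-level scores per hospital."""
        f_statistic, _p_value = stats.f_oneway(*scores_by_hospital)
        return 1.0 - 1.0 / f_statistic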
There is no fixed level that reliability needs to reach for a measure to be useful, but 0.7 is a
commonly used rule-of-thumb. The hospital-level reliabilities of communication with doctors (0.76),
communication with nurses (0.89), responsiveness of hospital staff (0.81), cleanliness and quiet of the
physical environment (0.77), and discharge information (0.75) are all above the rule-of-thumb. Pain
control (0.62) and communication about medicines (0.68) are a little lower.
Construct validity represents the extent to which a measure relates to other measures in the way
expected. If the proposed domains are important factors in quality for consumer choice, we would
expect that hospitals with high scores on the composites would also have high scores on patient
willingness to recommend the hospital and the overall rating. That is, the composites should be
positively correlated at the hospital level with the willingness to recommend the hospital and the
overall rating of the hospital. Again, there is no fixed level that these correlations need to reach for
the composite to be useful, but 0.4 is a reasonable rule-of-thumb. The hospital-level correlations
between the composites and willingness to recommend the hospital were all above this level. The
correlation was 0.54 for communication with doctors, 0.76 for communication with nurses, 0.70 for
responsiveness of hospital staff, 0.68 for cleanliness and quiet of the physical environment, 0.72 for
pain control, 0.73 for communication about medicines, and 0.53 for discharge information. There
was a similar pattern for the correlations of the composites with the overall hospital rating. Values
ranged from 0.81 for communication with nurses to 0.57 for discharge information.
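For illustration, these hospital-level correlations can be computed by first averaging patient
responses within each hospital and then correlating the hospital means (a sketch with hypothetical
column names, not the project's actual analysis code):

    # Hospital-level construct-validity correlation between a composite and the
    # willingness-to-recommend item. Column names are illustrative.
    import pandas as pd

    def hospital_level_correlation(df: pd.DataFrame, composite: str) -> float:
        hospital_means = df.groupby("hospital")[[composite, "recommend"]].mean()
        return hospital_means[composite].corr(hospital_means["recommend"])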
TABLE 3
HOSPITAL CAHPS SURVEY DOMAIN-LEVEL COMPOSITES,
INDICATORS OF PSYCHOMETRIC PERFORMANCE

                                      Hospital-level   Correlation with            Correlation with
Domain-level composite                reliability      willingness to recommend    overall rating
Communication with doctors            0.76             0.54                        0.59
Communication with nurses             0.89             0.76                        0.81
Responsiveness of hospital staff      0.81             0.70                        0.75
Cleanliness and quiet of environ.     0.77             0.68                        0.75
Pain control                          0.62             0.72                        0.76
Communication about meds.             0.68             0.73                        0.65
Discharge information                 0.75             0.53                        0.57

Note: All statistics are hospital-level; the two correlations are the construct-validity indicators.
Data from 3-state pilot study (130 hospitals, 19,683 discharges).
It should be noted, though, that the composites themselves are correlated, some highly so (see Table
4). One question is the extent to which each domain-level composite has an independent effect on
willingness to recommend the hospital and overall rating of the hospital. To examine this, AHRQ ran
regressions of willingness to recommend the hospital and overall rating of the hospital with the seven
composites.
TABLE 4
HOSPITAL-LEVEL CORRELATIONS AMONG DOMAIN COMPOSITES

                               Comm.     Comm.     Respons.   Clean. &   Pain      Comm.      Discharge
                               doctors   nurses    of staff   quiet      control   medicines  info.
Comm. with doctors             1.00
Comm. with nurses              0.55      1.00
Respons. of hosp. staff        0.62      0.91      1.00
Clean. and quiet of env.       0.39      0.64      0.64       1.00
Pain control                   0.82      0.82      0.85       0.59       1.00
Comm. about medicines          0.68      0.79      0.83       0.64       0.81      1.00
Discharge information          0.80      0.65      0.73       0.39       0.76      0.69       1.00

Data from 3-state pilot study (130 hospitals, 19,683 discharges).
The results of these regressions are shown in Table 5. Because of the high collinearity among the
composites, only communication with nurses and pain control had statistically significant effects in
the equation for willingness to recommend the hospital. Communication with nurses, pain control,
and communication about medicines had statistically significant effects in the equation for the overall
rating of the hospital.
TABLE 5
HOSPITAL-LEVEL REGRESSIONS USING DOMAIN COMPOSITES

                                     Willingness to recommend hospital     Overall rating of hospital
Domain-level composite               Estimate   Std. error   p-value       Estimate   Std. error   p-value
Communication with doctors           -0.296     0.160        0.067         -0.516     0.315        0.103
Communication with nurses             0.627     0.197        0.001          1.559     0.389        0.000
Responsiveness of hospital staff      0.046     0.120        0.706          0.138     0.237        0.562
Cleanliness and quiet of envir.       0.073     0.103        0.480          0.312     0.204        0.128
Pain control                          0.536     0.160        0.001          1.112     0.316        0.001
Communication about meds.             0.114     0.086        0.189          0.403     0.170        0.019
Discharge information                 0.126     0.213        0.555          0.485     0.420        0.251

R-square = 0.669 for willingness to recommend hospital.
R-square = 0.787 for overall rating of hospital.
Data from 3-state pilot study (130 hospitals, 19,683 discharges).
To further examine this issue, AHRQ conducted a hospital-level factor analysis just with items from
the reduced version of the questionnaire. This analysis extracted three factors. A factor that might
be called “nursing services” was formed that combined the communication with nurses,
responsiveness of hospital staff, and communication about medicines composites. A factor that might
be labeled “physician care” was formed that combined the communication with doctors, pain control,
and discharge information composites. The third factor was the same as the current cleanliness and
quiet of the hospital environment composite. AHRQ then ran regressions of willingness to
recommend the hospital and overall rating of the hospital with these three factors. The results are
presented in Table 6. The nursing services and physician care factors were strong predictors in both
the equation for willingness to recommend the hospital and the equation for overall rating of the
hospital. The effect of cleanliness and quiet of the environment was statistically significant in the
overall rating equation.
TABLE 6
HOSPITAL-LEVEL REGRESSIONS USING COMBINED FACTORS

                                      Willingness to recommend hospital     Overall rating of hospital
Factor                                Estimate   Std. error   p-value       Estimate   Std. error   p-value
Nursing services a/                   0.574      0.146        0.000         1.613      0.291        0.000
Physician care b/                     0.709      0.247        0.005         1.761      0.492        0.001
Cleanliness and quiet of environment  0.142      0.103        0.172         0.426      0.205        0.040

R-square = 0.619 for willingness to recommend hospital.
R-square = 0.751 for overall rating of hospital.
Data from 3-state pilot study (130 hospitals, 19,683 discharges).
a/ Combines communication with nurses, responsiveness of hospital staff, and communication about medicines.
b/ Combines communication with doctors, pain control, and discharge information.
Establishing domains that are important for public reporting is based on the psychometric
characteristics of the measures and the utility of the information from the perspective of consumers. The
input we have received suggests that the seven composites are valuable. Thus, the original composite
structure was maintained to best support consumer choice.
Another set of items is included on the survey for patient-mix adjustment and other analytic purposes.
The goal of HCAHPS is to collect information from patients using the HCAHPS survey and to
present the information based on those surveys to consumers, providers and hospitals. One of the
methodological issues associated with making comparisons between hospitals is the need to adjust
appropriately for patient-mix differences. Patient-mix refers to patient characteristics that are not
under the control of the hospital that may affect measures of patient experiences, such as
demographic characteristics and health status. The basic goal of adjusting for patient-mix is to
estimate how different hospitals would be rated if they all provided care to comparable groups of
patients.
As discussed earlier, there is an adjustment for the hospital reports to control for patient
characteristics that affect ratings and are differentially distributed across hospitals. Most of the
patient-mix items are included in the “About You” section of the instrument, while others are taken
from administrative records. Based on the mode experiment, and consistent with previous studies of
patient-mix adjustment in CAHPS and in previous hospital patient surveys, we will be using the
following variables in the patient-mix adjustment model:
Self-reported general health status (specified as a linear variable)
Education (specified as a linear variable)
Type of service (medical, surgical, or maternity care)
Age (specified as a categorical variable)
Admission through emergency room
Lag time between discharge and survey
Age by service line interaction
Language other than English spoken at home
Once the data are adjusted for patient-mix, there is a fixed adjustment for each of the reported
measures for mode of administration (discussed in detail below). The patient-mix adjustment
employs a regression methodology, also referred to as covariance adjustment.
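A minimal sketch of such a covariance (regression) adjustment follows, assuming a patient-level data
set containing the adjustor variables listed above; the column names are illustrative and this is not
the official HCAHPS algorithm, which is given in Appendix B:

    # Illustrative covariance (regression) adjustment; NOT the official HCAHPS
    # algorithm (see Appendix B). Column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    def patient_mix_adjust(df: pd.DataFrame) -> pd.Series:
        """Return hospital scores adjusted to the overall patient mix.

        Expected columns (illustrative): score, hospital, health_status, education,
        service_line, age_group, er_admit, lag_time, home_language.
        """
        model = smf.ols(
            "score ~ health_status + education + C(service_line) + C(age_group)"
            " + er_admit + lag_time + C(age_group):C(service_line) + home_language",
            data=df,
        ).fit()
        df = df.assign(expected=model.predict(df))
        by_hospital = df.groupby("hospital")[["score", "expected"]].mean()
        # Adjusted score: raw hospital mean minus the part explained by patient mix,
        # re-centered at the overall mean prediction.
        return by_hospital["score"] - by_hospital["expected"] + df["expected"].mean()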
On the survey, there are two additional questions to capture the race and ethnicity of the respondent.
These are not included in the patient-mix adjustment model but included as analytic variables to
support the congressionally mandated “National Healthcare Disparities Report” and “National
Healthcare Quality Report.” These reports provide annual, national-level breakdowns of HCAHPS
scores by race and ethnicity. Many hospitals collect information on race and ethnicity through their
administrative systems, but coding is not standard. Thus it was determined that administrative data
are not adequate to support the analyses needed for the reports and the items should be included in the
questionnaire.
In response to numerous requests from within CMS, the Department of Health and Human Services,
and external stakeholders, we plan to add five new items to the HCAHPS Survey beginning with July
2012 discharges. Three new items will be added to the "core" of the survey in order to create the new
Care Transition Composite measure, and two new items will be added to the “About You” section of
the survey: one concerning emergency room admission, and one concerning self-reported overall
mental health.
The three-item Care Transition Measure (CTM) will measure patients' perspectives on the
coordination of hospital discharge care. The CTM was developed by Dr. Eric A. Coleman of the
University of Colorado, a leading researcher in the area of care transitions, and has been endorsed
by the NQF as a voluntary consensus standard. The three new survey items are presented in
Table 7. The new CTM items will complement the two existing HCAHPS discharge questions,
provide detailed information on the transition to post-hospital care, and incentivize quality
improvement in this critical aspect of care.
In addition to the three CTM items, CMS plans to add two additional survey items to the “About You”
section of the HCAHPS Survey. The first is an item that asks patients whether they entered the hospital
through the emergency room. The second is an item that asks patients about their overall mental health.
Each of these items conforms to CAHPS standards and was tested in the HCAHPS Three-State
Pilot Study in 2003. However, neither item was ultimately adopted in the national implementation of
HCAHPS, primarily to keep the survey as short as possible. At that time, emergency room admission
information could be obtained from hospital administrative data, and patients' overall mental health
status was found to have relatively little impact on patient-mix adjustment in the Three-State Pilot Study.
At this time, CMS would like to add these items to the HCAHPS Survey.
Until 2010, “emergency room admission” as a point of origin for hospital patients was an
administrative code that was provided by hospitals. CMS employed this administrative code as a
patient-mix adjustment for HCAHPS scores. However, effective July 1, 2010, the “Point of Origin
for Admission or Visit” code for Emergency Room (code 7) was discontinued for use by Medicare
Systems and thus became unavailable for HCAHPS. CMS believes that emergency room admission
is a key element of HCAHPS patient-mix adjustment but has been unable to find any other hospital
administrative code that could be used to substitute for this purpose. The best alternative CMS has
found is a patient-reported item that had been developed for HCAHPS and was tested in 2003 (see
Table 7). This patient-reported emergency room admission item will be added to the "About You"
section of the HCAHPS Survey beginning in July 2012.
Although we chose not to add a survey item about patients' overall mental health status in the
national implementation of HCAHPS in 2006, we continue to receive inquiries and requests from
hospitals and researchers on this topic. Some researchers claim that mental health status is an
important factor in how patients respond to HCAHPS survey items. The continuing interest in this
topic, coupled with the direct impact of HCAHPS performance on hospital payments beginning in
October 2012, led us to decide to add an overall mental health item to the HCAHPS survey. The
overall mental health survey item we have chosen (see Table 7) very closely resembles the Overall
General Health item in the HCAHPS Survey, has been extensively tested, and is currently included in
several other CAHPS surveys. CMS plans to add this overall mental health item to the “About You”
section of the HCAHPS Survey in July 2012. We provide a copy of the revised HCAHPS Survey in
Appendix C, which shows the content and placement of the five additional survey items.
TABLE 7
NEW HCAHPS SURVEY ITEMS, JULY 2012

Care Transition Measure
  Q23 (CTM 1). During this hospital stay, staff took my preferences and those of my family or caregiver into account in deciding what my health care needs would be when I left.
    Response format: Strongly disagree, Disagree, Agree, Strongly agree
  Q24 (CTM 2). When I left the hospital, I had a good understanding of the things I was responsible for in managing my health.
    Response format: Strongly disagree, Disagree, Agree, Strongly agree
  Q25 (CTM 3). When I left the hospital, I clearly understood the purpose for taking each of my medications.
    Response format: Strongly disagree, Disagree, Agree, Strongly agree; I was not given any medication when I left the hospital

Emergency Room Admission item
  Q26 (ER). During this hospital stay, were you admitted to this hospital through the Emergency Room?
    Response format: Yes, No

Overall Mental Health item
  Q28 (MH). In general, how would you rate your overall mental or emotional health?
    Response format: Excellent, Very good, Good, Fair, Poor
Since its national implementation in 2006, HCAHPS has contained the same 27 survey items. Because
we will add five new items in 2012, it is necessary to conduct a new mode experiment in order to
investigate how each survey mode (HCAHPS permits mail, telephone, mail with telephone follow-up,
and active Interactive Voice Response modes) affects responses to the new CTM items. The structure
of the new mode experiment will closely resemble the original HCAHPS mode experiment, which is
described elsewhere in the statement.
The new mode experiment will include approximately 50 hospitals from around the USA that represent
key hospital characteristics. A sample of patients from each selected hospital will
be randomly assigned to each mode, survey results compared, and mode effects estimated to adjust for
their impact on survey response. CMS will use its HCAHPS subcontractors RAND and Health Services
Advisory Group to conduct the mode experiment and analyze the data.
Hospitals will have the opportunity to collect the new HCAHPS survey items on a voluntary basis
beginning in July 2012. As part of the FY 2013 Inpatient Prospective Payment System rule, CMS
will propose to add the new HCAHPS Survey items to the Annual Payment Update (APU)
requirements of IPPS hospitals. To implement the augmented HCAHPS survey, CMS will train
participating hospitals and survey vendors on the new items, as well as update survey materials,
telephone scripts, and file specifications for these items. Similar modifications will be required to
accommodate the public reporting of the CTM composite.
A 1.2 Survey Approach
The HCAHPS survey is administered in English and Spanish to a random sample of adult patients,
with at least one overnight stay, discharged from an acute care hospital. Psychiatric and pediatric
patients are excluded from the survey, as well as patients who were prisoners, who were discharged
to hospice care, who had a foreign home address, or who at admission requested the hospital not to
survey them (“no publicity” patients). In addition, self-administering hospitals and survey vendors
have the option of offering the mail version of the HCAHPS survey in Chinese, Russian and
Vietnamese, though using these instruments requires permission from CMS.
CMS requires that, following training, approved hospitals or vendors administer the
HCAHPS survey either (a) as a stand-alone survey or (b) integrated with the hospital's existing
survey. If the survey is integrated with an existing survey, HCAHPS' two global ratings, two
individual items, and the items that constitute the six domains must be placed at the beginning of the
questionnaire. Participating hospitals may place items from their current survey after these HCAHPS
core items (questions 1-22). HCAHPS demographic items (questions 23-27) may be placed
anywhere in the questionnaire after the core items. Other than this permitted insertion of non-HCAHPS
items, the order, wording and response categories of the 27 HCAHPS items may not be
altered in any way. CMS suggests that the hospital/vendor use transitional phrasing such as the
following to transition from the HCAHPS items to the hospital-specific ones:
“Now we would like to gather some additional detail on topics we have asked you about
before. These items use a somewhat different way of asking for your response since they are
getting at a little different way of thinking about the topics."
Flexibility in the mode of administering the survey is permitted. The hospitals/vendors may use any
of the following modes: telephone only, mail only, a mixed methodology of mail with telephone
follow-up, or active interactive voice response (IVR). All modes of administration require following
a standardized protocol. Quality assurance guidelines have been developed for each mode of survey
administration detailing issues related to protocol, handling of the questionnaires and other materials,
training, interviewing systems, attempts, monitoring and oversight.
Modes of Survey Administration
Mail Only
For the mail only option, the hospital/vendor is required to send the HCAHPS questionnaire, alone or
combined with hospital-specific questions, along with a cover letter, between 48 hours and 6 weeks
following discharge. CMS provides sample cover letters to hospitals/vendors in its training program
for HCAHPS. The hospitals/vendors may tailor their letters, but the letters must contain information
about the purpose of the survey and must state that participation in the survey is voluntary and will not
affect patients' health care benefits.
The hospital/vendor must send a second questionnaire with a reminder/thank you letter to those not
responding approximately 21 days after the first mailing. Data collection is closed out for a
particular respondent within 21 days following the mailing of the second questionnaire.
Telephone Only
For the telephone only option, the hospital/vendor is required to begin data collection between 48
hours and six weeks following discharge. The hospital/vendor must attempt to contact respondents
up to 5 times unless the respondent explicitly refuses to complete the survey. These attempts must be
made on different days of the week and different times of the day and in different weeks to ensure
that as many respondents are reached as feasible. Data collection is closed out for a particular
respondent 42 days following the first telephone attempt.
For the HCAHPS portion of the survey, CMS provides telephone interviewing scripts in both English and Spanish. The interviewers conducting the survey must be trained before they begin interviewing. In its training program for HCAHPS, CMS provides guidance on how
to train interviewers to conduct HCAHPS. The training program emphasizes that interviewers read
questions as worded, use non-directive probes, maintain a neutral and professional relationship with
the respondent, and that interviewers record only the answers that the respondents themselves choose.
Mixed Mode
A third option is a combination of mail and telephone. In this mixed mode of administration, there is
one wave of mailing (cover letter and questionnaire) and up to five telephone call-back attempts for
non-respondents. The first survey is sent out between 48 hours and 6 weeks following discharge.
Telephone follow-up is initiated for all non-respondents approximately 21 days after the initial
questionnaire mailing. The telephone attempts must be made on different days of the week and
different times of the day, and in different weeks to ensure that as many respondents are reached as
feasible. Telephone interviewing must end 6 weeks after the first survey mailing. Similar to the
telephone only mode, CMS provides telephone scripts for the hospitals/vendors to follow.
Active IVR
For active IVR, hospitals/vendors must initiate data collection by phone between 48 hours and six weeks following discharge. A live interviewer asks the respondent if she/he is willing to complete the survey using the IVR system. Through the IVR system, respondents complete the survey using
the touch-tone keypad on their phone. The hospital/vendor is required to provide an option for the
respondent to opt out of the system and return to a live interviewer.
Similar to the telephone mode, the hospital/vendor must call each respondent up to 5 times unless the
respondent refuses to complete the survey. These attempts must be made on different days of the
week and different times of the day, and in different weeks to ensure that as many respondents are
reached as feasible. Data collection must be closed out for a particular respondent 42 days following
the first telephone attempt.
Sampling
A description of sampling for HCAHPS follows, including the basic structure of sampling, population
and sampling frame, and sampling approach.
Basic structure of sampling
We received public input regarding sampling. The majority of respondents preferred to sample
discharges on a more continuous basis (i.e., a monthly basis) and cumulate these samples to create
rolling estimates based on 12-months of data. We chose to pursue the more continuous sampling
approach for the following reasons:
o It is more easily integrated with many hospitals' existing survey processes used for internal improvement.
o Improvements in hospital care can be more quickly reflected in hospital scores (e.g., 12-month estimates could be updated on a quarterly or semi-annual basis).
o Hospital scores are less susceptible to unique events that could affect hospital performance at a specific point in time (e.g., a temporary shortage of nursing staff that could adversely affect the hospital's score if the survey was done only once during the year at the same time as the shortage).
o It is less susceptible to gaming (e.g., hospitals being on their best behavior just around the survey).
o By sampling and collecting data on a monthly basis, it is not necessary to reach so far back in time in order to obtain a sufficient number of discharges to meet the sample size requirements.
For these reasons, the basic structure of sampling for HCAHPS entails drawing a sample of relevant
discharges on a monthly basis. Data will be collected from patients in each monthly sample and will
be cumulated to create a rolling 12-month data file for each hospital. Hospital-level scores for the
HCAHPS measure will be produced using 12 months of data. After the initial 12 months of data
collection, the hospital-level scores will be updated on a quarterly basis utilizing the most recent 12 months' worth of data.
Population and sampling frame
HCAHPS is designed to collect data on care from the patient's perspective for general acute care
hospitals. Pediatric, psychiatric, and other specialty hospitals are not included (additional/different
aspects of care need to be examined for these specialty settings). Within general acute care hospitals,
HCAHPS scores are designed to reflect the care received by patients of all payers (not just Medicare
patients). Specifically, this includes the population of non-psychiatric and non-pediatric patients who
had an overnight stay in the hospital and were alive when discharged. Patients who did not have an
overnight stay are excluded because we do not want to include patients who had very limited inpatient
interaction in the hospital (e.g., patients who were admitted for a short period for purely observational
purposes; patients getting only outpatient care are not included in HCAHPS). Patients who died in
the hospital are excluded because HCAHPS is designed to capture information from the perspective
of the patient himself or herself, thus proxy respondents are not permitted. For logistical reasons, we
exclude several other categories of patients from eligibility: those who were prisoners, who were
discharged to hospice care, who had a foreign home address, or who at admission had requested the
hospital not to survey them (“no publicity” patients).
CMS designed the HCAHPS survey with the intention of capturing the views of the broadest sample
of patients discharged from short-term, acute care hospitals. Therefore, categorical exclusions of
patients from the HCAHPS survey are few and are based on CMS policy decisions. Patients will be
excluded only when the survey does not properly apply to them (pediatric patients below age 18, and
psychiatric patients), or when they have died in hospital or prior to being surveyed.
While a wide spectrum of patients is eligible for participation in HCAHPS, personal identities are not asked for or revealed. Eligible discharged patients are randomly surveyed by hospitals or their designated survey vendor. We have designed the data files so that all protected health information (PHI) is de-identified (see 45 CFR § 164.514, de-identification of PHI) before transmission to CMS. Thus, a patient's personal identity is not transmitted to, revealed to, or used by CMS. It is our intent that hospitals and survey vendors submit a thoroughly de-identified data set through the Quality Net Exchange. The data are analyzed by the Health Services Advisory Group, a Quality Improvement Organization. Hospital-level data are then transmitted to CMS for public reporting.
The monthly sampling frame that defines the population for a given hospital includes all discharges
between the first and last days of the month, with the exclusions noted above. CMS periodically
reviews the patient sampling process employed by participating hospitals to assure that patients are
being properly included or excluded, and that patients are being surveyed at random. Some states
have further restrictions on patients who may be contacted. Hospitals/vendors should exclude other
patients as required by law or regulation in the state in which they operate.
Sampling approach
Hospitals and survey vendors have several options for the monthly sampling of eligible discharges,
including: a simple random sample (which includes a census of all eligible discharges); a
proportionate stratified random sample (PSRS); or a disproportionate stratified random sample
(DSRS). The simple random sample is considered normative. Hospitals/survey vendors must receive an exception in order to use a disproportionate stratified random sample, and must provide the name, number of eligible discharges, and number of sampled discharges for each stratum.
In the simple random sample approach, the hospital/vendor will draw a simple random sample each
month from the sampling frame of eligible discharges. Sampling can be done at one time after the
end of the month or continuously throughout the month as long as a simple random sample is
generated for the month (the hospital/vendor can choose what works best with its current survey
activities for internal improvement).
As noted above, monthly data are cumulated to produce a rolling 12-month data file for each hospital
that will be used to produce hospital-level scores for the HCAHPS measure. A target for the
statistical precision of these scores is based on the reliability criterion. Higher reliability means
a higher "signal-to-noise" ratio in the data. The target is that the reliability for the global ratings and most
composites be 0.8 or more. Based on this, it is necessary for each hospital to obtain 300 completed
HCAHPS questionnaires over a 12 month period. Small hospitals that are unable to reach the target
of 300 completes in a given 12-month period should sample all eligible discharges and attempt to
obtain as many completes as possible.
In order to calculate the number of discharges that need to be sampled each month to reach this target,
it is necessary to take into account the proportion of sampled patients expected to complete the survey
(not all sampled patients who are contacted to complete the survey will actually complete it; the
target is 300 completes). This is a function of the proportion of sampled patients who turn out to be
ineligible for the survey and the survey response rate among eligible respondents. The calculation of
the sample size needed each month can be summarized as follows:
Step 1 – Identify the number of completes needed over 12 months
C = number of completes needed = 300
Step 2 – Estimate the proportion of sampled patients expected to complete the survey
Let:
I = expected proportion of ineligible sampled patients
R = expected survey response rate among eligible respondents
P = proportion of sampled patients expected to complete the survey
= (1 – I) x R
Step 3 – Calculate the number of discharges to sample
N12 = sample size needed over 12 months = C / P
N1 = sample size needed each month = N12 / 12
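To illustrate the arithmetic, the following sketch applies Steps 1 through 3. The 5 percent ineligibility rate and 33 percent response rate used here are assumed values chosen only for illustration; they are not HCAHPS requirements.

    import math

    def monthly_sample_size(completes_needed=300, ineligible_rate=0.05, response_rate=0.33):
        """Return the 12-month and monthly sample sizes needed to reach the target."""
        p = (1 - ineligible_rate) * response_rate  # P = (1 - I) x R
        n12 = completes_needed / p                 # N12 = C / P
        n1 = n12 / 12                              # N1 = N12 / 12
        return math.ceil(n12), math.ceil(n1)

    n12, n1 = monthly_sample_size()
    print(n12, n1)   # about 957 discharges over 12 months, roughly 80 per month

Under these assumed rates, a hospital would need to sample roughly 80 eligible discharges per month to reach 300 completed surveys over 12 months.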
Some small hospitals will not be able to reach the target of 300 completes in a given 12-month
period. In such cases, the hospital should sample all discharges and attempt to obtain as many
completes as possible. All hospital results are publicly reported. However, the lower precision of scores derived from fewer than 300 completed surveys is noted in the public reporting. CMS also publicly reports the response rate achieved by each hospital participating in HCAHPS and the approximate number of completed surveys ("less than 100", "100 to 299", and "300 or more").
B 1.0 Need and Legal Basis
HCAHPS is part of the Hospital Quality Alliance, a private/public partnership that includes the
American Hospital Association, the Federation of American Hospitals, and the Association of
American Medical Colleges, Joint Commission on Accreditation of Healthcare Organizations,
National Quality Forum, AARP, and CMS/AHRQ. Working together, the group wants to 1) provide
useful and valid information about hospital quality to the public; 2) provide hospitals a sense of
predictability about public reporting expectations; 3) begin to standardize data and data collection
mechanisms; and 4) foster hospital quality improvement. Clinical information was made available on
the Internet at www.cms.gov in the summer of 2003, and was added to the Hospital Compare
website, www.hospitalcompare.hhs.gov, in April 2005. HCAHPS results were added to this website
beginning in March 2008.
In addition, beginning with fiscal year 2008, participation in HCAHPS became part of the Reporting
Hospital Quality Data Annual Payment Update (RHQDAPU) program. Hospitals
that are subject to the inpatient prospective payment system (IPPS) provisions (RHQDAPU-eligible
"subsection (d) hospitals") must meet the new reporting requirements in order to
receive their full IPPS annual payment update (APU). Hospitals that do not participate in the
RHQDAPU initiative, which includes the new HCAHPS reporting requirements, could receive
a reduction of 2.0 percent in their Medicare Annual Payment Update for fiscal year 2008. Non-IPPS
hospitals can voluntarily participate in HCAHPS.
The recently enacted Patient Protection and Affordable Care Act of 2010 (P.L. 111-148)
includes HCAHPS among the measures to be used to calculate value-based incentive payments in the
Hospital Value-Based Purchasing program, beginning with discharges in October 2012 (see Section
3001).
B 2.0 Purpose and Use of Information
Three broad goals have shaped HCAHPS. First, the survey is designed to produce comparable data
on the patient's perspective on care that allows objective and meaningful comparisons between
hospitals on domains that are important to consumers. Second, public reporting of the survey results
is designed to create incentives for hospitals to improve their quality of care. Third, public reporting
serves to enhance public accountability in health care by increasing the transparency of the quality of
hospital care provided in return for the public investment. With these goals in mind, the HCAHPS
project has taken substantial steps to assure that the survey is credible, useful, and practical. This
methodology and the information it generates are made available to the public.
There are many excellent patient surveys currently in use by hospitals. However, most of them are
proprietary and are not constructed in a way that would allow patient assessment of hospital care
across the country. That is, the instruments are not standardized. A standardized instrument enables consumers to make "apples to apples" comparisons among hospitals, allows hospitals and hospital chains to make self-comparisons, and increases the public accountability of hospitals.
Development under the CAHPS umbrella has produced a reliable and valid standardized instrument that any organization can use to obtain patient data about hospital experiences. This tool
was adopted by the Hospital Quality Alliance. Data collection began in October 2006, and the first
public reporting of participating hospitals' HCAHPS results occurred in March 2008.
B 3.0 Use of Improved Information Technology
Since CMS is committed to allowing flexibility in the administration of HCAHPS, one of the four modes for administering the survey is active IVR. IVR, short for interactive voice response, is a telephony technology in which a respondent uses a touch-tone telephone to interact with a database, either to acquire information from it or to enter data into it. IVR technology does not require human interaction over the telephone, because the user's interaction with the database is limited to the options the IVR system makes available. For example, banks and credit card companies use IVR
systems so that their customers can receive up-to-date account information instantly and easily
without having to speak directly to a person. IVR technology is also used to gather information, as in
the case of telephone surveys in which the user is prompted to answer questions by pushing the
numbers on a touch-tone telephone. Patients selected for this mode will, however, be able to opt out of the interactive voice response system and return to a "live" interviewer.
B 4.0 Efforts to Identify Duplication
Many hospitals have used their own patient satisfaction or patient experience of care surveys. These
diverse, often proprietary surveys do not allow for useful comparisons across hospitals. Making
comparative performance information available to the public can help consumers make more
informed choices when selecting a hospital and can create incentives for hospitals to improve the care
they provide.
Adding hospital-specific items
Hospitals/vendors have the option to add their own questions to the HCAHPS core questionnaire. Any added questions must be placed after the core HCAHPS items (questions 1-22). The "About You" section of HCAHPS items can be placed after the core HCAHPS items or after the hospital-specific items. A hospital/vendor that decides to add its own questions should pay attention to the length of the questionnaire: the longer the questionnaire is, the greater the burden on respondents.
Hospitals participating in HCAHPS will submit to CMS only the data collected for HCAHPS
purposes. Hospitals will retain possession of all of the data they collect from the HCAHPS survey and
may analyze it according to their own needs. The analysis that CMS performs is done for the
purposes of public reporting.
To promote its wide and rapid adoption, HCAHPS has been carefully designed to fit within the
framework of patient satisfaction surveying that hospitals currently employ. Still, CMS fully
understands that participation in the HCAHPS initiative will require some effort and expense on the
part of hospitals that volunteer to take part. CMS has provided, and will continue to provide, mandatory training for HCAHPS at no fee to fully inform hospitals and survey vendors about implementation,
reporting, and other issues. In addition, hospitals may fully trial HCAHPS (but with no public
reporting of results) by conducting a “dry run” before they fully participate in HCAHPS.
CMS does not require that hospitals drop any items from their ongoing patient satisfaction surveys in
order to participate in the HCAHPS initiative. The content of the HCAHPS survey has been kept to
the minimum number of items necessary to fulfill the objective of providing valid and comparable
information on topics of greatest importance to consumers. CMS does suggest that hospitals consider
removing current items that essentially duplicate HCAHPS items. CMS makes no recommendations
with respect to hospital items that are unrelated to HCAHPS domains, or to patient surveys targeted at
those not eligible for HCAHPS.
Similarly, CMS does not require or recommend that participating hospitals abandon their customized
surveys of specific patient groups. The recommended number of completed HCAHPS surveys (at
least 300 per year from a random sample of eligible patients; 100 from smaller hospitals) can be
accommodated within the surveying plans of most hospitals that conduct patient satisfaction surveys.
A hospital's participation in HCAHPS does not mean that it can or should survey only enough of its
HCAHPS-eligible patients to produce 300 completed surveys per year. At its discretion, a hospital
may submit more than 300 completed HCAHPS surveys, sample patients not eligible for HCAHPS,
stratify patients according to its own criteria, or over-sample certain types of patients. During
training CMS presents detailed information on sampling for HCAHPS, including how the HCAHPS
sample can be accommodated within the stratified sampling schemes employed by hospitals.
Detailed information on sampling and other important aspects of administering the HCAHPS survey
can be found in the current version of the HCAHPS Quality Assurance Guidelines, which is available
on our website, www.hcahpsonline.org. In addition, policy and protocol updates, training
announcements, etc. are regularly posted on this website.
There are two additional issues related to integrating surveys: mixing HCAHPS response options with the differing response options on current vendor surveys, and revisiting domains that have already been covered earlier in the questionnaire. Both issues can be handled by using transitional phrasing between the HCAHPS questions and the hospital-specific ones. An example of such phrasing is as follows:
“Now we would like to gather some additional detail on topics we have asked you about before.
These items use a somewhat different way of asking for your response since they are getting at a little
different way of thinking about the topics”
B 5.0 Involvement of Small Entities
We realize that some small hospitals that wish to voluntarily participate in HCAHPS will not be able
to reach the target of 300 completes in a given 12-month period. In such cases, the hospital should
sample all discharges and attempt to obtain as many completes as possible. HCAHPS results from
such hospitals will be publicly reported, but the lower stability of scores based on fewer than 300
completed surveys will be noted.
According to the Abt Associates cost and benefits report, the cost for a small hospital to obtain 100 completed surveys ranges from $1,000 to $1,500 if HCAHPS is implemented as a stand-alone survey, and approximately $326 annually if HCAHPS is integrated into an existing survey.
B 6.0 Consequences if Information Collected Less Frequently
We spent a great deal of time considering how often HCAHPS data should be collected and solicited public feedback on this issue, receiving substantial comment in response to the June 27, 2003 Federal Register Notice. Two options for frequency of data collection were
suggested: once during the year, and continuous. The majority of respondents to this notice
suggested that continuous sampling would be easier for them to integrate into their current data
collection processes. We have decided to require sampling of discharges on a more continuous basis (i.e., a monthly basis), with these samples cumulated to create rolling estimates based on 12 months of data. We chose to pursue the continuous sampling approach for the following reasons:
o It is more easily integrated with many existing survey processes used for internal improvement.
o Improvements in hospital care can be more quickly reflected in hospital scores (e.g., 12-month estimates could be updated on a quarterly or semi-annual basis).
o Hospital scores are less susceptible to unique events that could affect hospital performance at a specific point in time.
o It is less susceptible to gaming (e.g., hospitals being on their best behavior just around the survey).
o There is less variation in time between discharge and data collection.
B 7.0 Special Circumstances
There are no special circumstances with this information collection request.
B 8.0 Federal Register Notice/Outside Consultation
A 60-day Federal Register notice was published on July 13, 2007. Throughout the HCAHPS
development process, CMS has solicited and received a great deal of public input. As a result, the
HCAHPS questionnaire and methodology have gone through several iterations. The first was a 66-item version that was tested in a three-state pilot study (this version was developed for testing purposes; we never intended that the final version be this long). Prior to the start of the pilot test, a Federal Register
notice was published in February 2003 soliciting input on the proposed pilot study. This notice
produced nearly 150 comments. Based on results of the pilot study, the questionnaire was reduced to
32 items. CMS received additional feedback from a Federal Register Notice published in June 2003
that sought further comment on the survey and implementation issues while the initial pilot testing
was underway. CMS received 110 responses to the notice from hospital associations, provider
groups, consumers/purchasers, and hospital survey vendors.
A 32-item version of the HCAHPS instrument was published in the Federal Register in December
2003 for public comment; CMS received nearly 600 comments that focused on the following topics:
sampling, response rate, implementation procedures, cost issues, the length of the instrument,
exclusion categories, question wording, and reporting.
CMS, AHRQ, the CAHPS grantees and members of the Instrument, Analysis, and Cultural
Comparability teams met via conference calls two to three times per week to review all comments
from the three Federal Register Notices and to modify the survey instrument and implementation
strategy based on the comments. The input we received was conflicting: survey vendors and many
hospitals indicated that the 32-item version was too long, while consumer groups indicated that the
full content was needed to support consumer choice. After the comments were reviewed, CMS and
AHRQ held additional discussions with hospitals, vendors and consumers to discuss the comments
received.
Using the comments received from the public and stakeholders, and the psychometric analysis of the
data from the 3-state pilot study, CMS reduced the next version of the HCAHPS questionnaire to 25
items. The following questions were eliminated from the longer versions of the questionnaire: overall
rating of the doctor; overall rating of the nurse; how often did doctors treat you with courtesy and
respect; how often did nurses treat you with courtesy and respect; overall mental health status; and
two items related to whether and how a proxy helped complete the survey. The item regarding how
often hospital staff asked if you were allergic to any medicine was also eliminated from the survey. A question from the original 66-item survey was used to replace the allergy question. This newly re-introduced item asks, "Before giving you the medicine, how often did hospital staff tell you what the medicine was for?" (Question 14 on the 25-item survey). In response to public input, we have also
eliminated the reference to doctors in the screener question that identifies patients who needed help
getting to the bathroom or using a bedpan (Question 8 on the 25-item survey).
The questions on overall mental health status and the two proxy questions were dropped because their
impact in the patient-mix adjustment was negligible. Questions about being shown courtesy and
respect by doctors and nurses were pulled from the latest version because input received indicated
that the two items on doctors or nurses “explaining things fully” and “listening carefully” were
synonymous with showing courtesy and respect. Taking these questions out of these composites did
not adversely impact the psychometric properties of the doctor and nurse communication composites.
The allergy question originally had a "never" to "always" scale that did not work well because the question is usually asked only once, upon admission. Changing the scale to a "yes/no" response provided very little variation across hospitals.
Following a thorough, multi-stage review process, HCAHPS was endorsed by the National Quality
Forum (NQF) board in May 2005. In the process, NQF recommended a few modifications to the
instrument. As a result of the recommendation of the National Quality Forum Consensus
Development Process, the two courtesy and respect items were added back into the survey. The
review committee felt that these questions are important to all patients, but may be particularly
meaningful to patients who are members of racial and ethnic minority groups. The two reinstated
items were: “During this hospital stay, how often did nurses treat you with courtesy and respect?”,
and “During this hospital stay, how often did doctors treat you with courtesy and respect?”.
B 9.0 Payments/Gifts to Respondents
There are no provisions for payments or gifts to respondents.
B 10.0 Assurance of Confidentiality
All information obtained through the survey is reported in the aggregate. No individual respondent's information is reported separately or with identifying information.
We have designed the data files so that the hospital/vendor submits a de-identified dataset to CMS through a QIO, in accordance with 45 CFR § 164.514.
In all the modes of survey administration, guidelines are included on issues related to confidentiality:
o Cover letters are not to be attached to the survey.
o Respondents' names are not to appear on the survey.
o Interviewers are not to leave messages on answering machines or with household members, since this could violate a respondent's privacy.
B 11.0 Information of a Sensitive Nature
There are no questions of a sensitive nature on the survey.
B 12.0 Estimates of Annualized Burden
National Implementation
In the most recent public reporting of HCAHPS results on Hospital Compare in July 2011,
approximately 2,710,000 patients discharged between October 2009 and September 2010 completed
the HCAHPS survey. On average, it will take respondents 8 minutes (0.13333 hours) to complete the expanded survey, for a total annual burden of 361,324 hours.
To calculate the cost per response, we employ the mean hourly wage rate of $21.35 from “National
Compensation Survey: Occupational Wages in the United States, May 2010,” U.S. Department of Labor,
Bureau of Labor Statistics. Thus, the annual cost burden of the survey is $7,714,267.
Approximately 3,812 hospitals participated in the most recent public reporting period. Using Abt
Associates' estimate of approximately $4,000 per hospital for HCAHPS data collection (see below), the
annual cost burden is $15,248,000. Assuming one hour per hospital, the annual burden for hospitals is
3,812 hours.
In total, the annual cost burden of the survey is (Patients $7,714,267) + (Hospitals $15,248,000) =
$22,962,267, and the annual hour burden is (Patients 361,324 hours) + (Hospitals 3,812 hours) =
365,136 hours.
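The burden arithmetic above can be reproduced directly from the figures reported in this section. The short sketch below uses only those figures (with one hour assumed per hospital, as stated) and introduces no new data.

    completed_surveys = 2_710_000   # completed HCAHPS surveys, October 2009 - September 2010
    hours_per_survey = 0.13333      # 8 minutes per survey
    hourly_wage = 21.35             # BLS mean hourly wage, May 2010
    hospitals = 3_812               # hospitals in the most recent public reporting period
    cost_per_hospital = 4_000       # Abt Associates' approximate data collection cost per hospital

    patient_hours = round(completed_surveys * hours_per_survey)   # 361,324 hours
    patient_cost = patient_hours * hourly_wage                    # about $7,714,267
    hospital_cost = hospitals * cost_per_hospital                 # $15,248,000
    hospital_hours = hospitals * 1                                 # one hour per hospital

    print(f"Total cost:  ${patient_cost + hospital_cost:,.0f}")   # $22,962,267
    print(f"Total hours: {patient_hours + hospital_hours:,}")     # 365,136 hours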
B 13.0 Capital Costs
Hospitals have the option to conduct HCAHPS as a stand-alone survey or to integrate it with their
existing survey activities. They can choose to administer HCAHPS by mail, phone, or active IVR.
Costs associated with collecting HCAHPS will vary depending on:
o The method hospitals currently use to collect patient survey data;
o The number of patients surveyed (target is 300 completed surveys per year); and
o Whether it is possible to incorporate HCAHPS into their existing survey.
Abt estimates that the average data collection cost for a hospital conducting the 27-item version of
HCAHPS as a stand-alone survey would be between $3,300 and $4,575 per year, assuming that 80-85
percent of hospitals collect HCAHPS by mail and the remainder by phone or active IVR. The costs
for combining HCAHPS with existing surveys would be considerably less. It would cost $978 per
hospital annually to incorporate the 27-item version of HCAHPS into existing surveys.
Abt Associates estimates that the nationwide data collection cost for conducting the 27-item version
of HCAHPS would be approximately $15 million per year if all eligible hospitals participated,
depending upon the extent to which hospitals integrate HCAHPS with their existing survey activities
or conduct it as a stand-alone survey.
Approximately 3,812 hospitals participated in the most recent public reporting period. Using Abt
Associates' estimate of approximately $4,000 per hospital for HCAHPS data collection, the annual cost
burden is $15,248,000.
B 14.0 Estimates of Annualized Cost to the Government
Costs to the government are for training; technical assistance; ensuring the integrity of the data; approving hospitals/vendors for HCAHPS; accumulating the data; analyzing the data; making adjustments for patient-mix and mode of administration; and public reporting. The annual cost to the
Federal Government is estimated to be $2,500,000.
B 15.0 Changes in Burden
All changes in burden are due to agency discretion. These result from the additional time (1 minute)
to complete the expanded survey, an increase in the number of surveys completed by patients, an
increase in the number of participating hospitals, and an increase in the mean hourly wage rate since
our previous submission in 2010.
The estimated amount of time to complete the expanded HCAHPS survey is 8 minutes, compared to
7 minutes in our 2010 submission, an increase of 14.3%. Approximately 2,710,000 completed
surveys were included in the July 2011 public reporting on Hospital Compare, an increase of 230,000
(9.3%) compared to our 2010 submission. In addition, 3,812 hospitals reported their HCAHPS
scores, which is an increase of 37 (1%) compared to our 2010 submission. To calculate the cost per
response, we employ the mean hourly wage rate of $21.35 from “National Compensation Survey:
Occupational Wages in the United States, May 2010,” U.S. Department of Labor, Bureau of Labor
Statistics. Thus, the estimated annual cost burden of the survey is $15,248,000 for hospitals, which is an
increase of $148,000 (1%) compared to our 2010 submission, and $7,714,267 for patients, an increase of
$2,054,737 (36.3%). The annual hour burden of the survey for patients is 361,324, an increase of
71,982 hours (24.9%) compared to our 2010 submission.
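As a cross-check, the percentage increases cited above can be recomputed from the stated current values and increases. In the sketch below, the 2010 baselines are derived by subtracting each stated increase from the current figure; they are not quoted directly from the earlier submission.

    current = {"minutes": 8, "surveys": 2_710_000, "hospitals": 3_812,
               "hospital_cost": 15_248_000, "patient_cost": 7_714_267, "patient_hours": 361_324}
    increase = {"minutes": 1, "surveys": 230_000, "hospitals": 37,
                "hospital_cost": 148_000, "patient_cost": 2_054_737, "patient_hours": 71_982}

    for key in current:
        prior = current[key] - increase[key]          # implied 2010 baseline
        pct = 100 * increase[key] / prior             # percentage increase over the baseline
        print(f"{key}: {prior:,} -> {current[key]:,} (+{pct:.1f}%)")

Running this reproduces the percentages reported above (14.3%, 9.3%, 1%, 1%, 36.3%, and 24.9%, respectively).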
B 16.0 Time Schedule, Publication, and Analysis Plans
Since October 2006, the HCAHPS survey has been administered on a continuous basis, and since
March 2008, HCAHPS results have been publicly reported on the Hospital Compare website four
times per year. This pattern will continue into the foreseeable future. In addition, we plan to
continue to offer both HCAHPS Introductory Training and HCAHPS Update Training on an annual
basis.
B 17.0 OMB Expiration Date Exemption
CMS would like an exemption from displaying the expiration date on this collection as the survey
will be used on a continuing basis. To include an expiration date would result in having to discard a
potentially large number of surveys.
B 18.0 Exceptions to Certification Statement
The proposed data collection does not involve any exceptions to the certification statement identified
in line 19 of OMB Form 83-I.