National Implementation of the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Survey (CMS-10102)

OMB: 0938-0981


April 2018






National Implementation of the Hospital Consumer Assessment of Healthcare Providers and Systems

(HCAHPS) Survey


CMS 10102




OMB Supporting Statement - Part B







Prepared by


Division of Consumer Assessment & Plan Performance
Centers for Medicare & Medicaid Services

7500 Security Boulevard

Baltimore, MD 21244




TABLE OF CONTENTS

Introduction

  1. Respondent Universe and Sampling

  2. Data Collection Procedures

    a. Statistical Methodology for Stratification and Sample Selection

    b. Estimation Procedures

    c. Degree of Accuracy Needed for the Purpose Described in the Justification

    d. Unusual Problems Requiring Specialized Sampling Procedures

    e. Any Use of Periodic Data Collection Cycles to Reduce Burden

  3. Maximizing Response Rates/Non-response and Issues of Accuracy, Reliability and Validity

  4. Past and Ongoing Tests of Procedures, Training, and Quality Improvement Activities; Collaborator and Contractor Participation

  5. Names and Telephone Numbers of Individuals Consulted on Statistical Aspects of the Survey and Implementation Design and Names of Agency Units, Contractor(s), Grantee(s), or Other Persons(s) who Collect and/or Analyze the Information for the Agency

TABLES

Table 1. Hospital-Level Spearman-Brown Reliabilities of HCAHPS Top-Box Scores of HCAHPS Measures at 300 Completed Surveys, 3.1 million discharges, January – December 2016

Table 2. Cronbach’s Alphas of HCAHPS Composite Measures at 300 Completed Surveys, 2.9 million discharges, July 2015 - June 2016

Table 3. HCAHPS Patient-level Correlations, 3.1 million discharges, July 2015 – June 2016

Table 4. Hospital Top-Box Correlations for HCAHPS Measures, July 2015 – June 2016 discharges (4,304 hospitals, 3.1 million surveys)

Table 5. HCAHPS Survey Development and Implementation Timeline, 2002-2008


LIST OF ATTACHMENTS


Attachment A -- HCAHPS Survey Instrument (Mail) and Supporting Material (included with Supporting Statement - Part A)



Attachment B -- HCAHPS Survey Instrument (Telephone) and Supporting Material (included with Supporting Statement - Part A)



Attachment C -- HCAHPS Survey Instrument (Interactive Voice Response) and Supporting Material (included with Supporting Statement - Part A)

OMB SUPPORTING STATEMENT – Part B:

National Implementation of the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Survey


CMS-10102


Introduction


The Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Survey is administered to a random sample of adult inpatients between 48 hours and six weeks after discharge. Patients admitted in the medical, surgical, and maternity care service lines are eligible for the survey. HCAHPS is not restricted to Medicare patients. Hospitals may use an approved survey vendor or collect their own HCAHPS data, if approved by CMS to do so. HCAHPS can be implemented in four survey modes: mail, telephone, mail with telephone follow-up, or active interactive voice response (IVR), each of which requires multiple attempts to contact patients. Hospitals must survey patients throughout each month of the year. IPPS hospitals must achieve at least 300 completed surveys over four calendar quarters. Official versions of the HCAHPS Survey are available in English, Spanish, Chinese, Russian, Vietnamese, and Portuguese. The HCAHPS Survey sampling protocol promotes the following: 1) standardized administration of the HCAHPS Survey by hospitals/survey vendors and 2) comparability of resulting data across all participating hospitals. The survey and its protocols for sampling, data collection, coding, and submission can be found in the HCAHPS Quality Assurance Guidelines manual (Version 13.0, March 2018) on the official HCAHPS On-Line Web site, www.hcahpsonline.org.


  1. Respondent Universe and Sampling

The HCAHPS Survey is broadly intended for patients of all payer types who meet the following criteria:

    • Eighteen (18) years or older at the time of admission

    • Admission includes at least one overnight stay in the hospital

    • Non-psychiatric MS-DRG/principal diagnosis at discharge

    • Alive at the time of discharge


There are a few categories of otherwise eligible patients who are excluded from the sample frame. These are:


    • “No-Publicity” patients – Patients who request that they not be contacted

    • Court/Law enforcement patients (i.e., prisoners); patients residing in halfway houses are included

    • Patients with a foreign home address (U.S. territories – Virgin Islands, Puerto Rico, Guam, American Samoa, and Northern Mariana Islands are not considered foreign addresses and are not excluded)

    • Patients discharged to hospice care (Hospice-home or Hospice-medical facility)

    • Patients who are excluded because of state regulations

    • Patients discharged to nursing homes and skilled nursing facilities


Hospitals/Survey vendors must retain documentation that verifies all exclusions and ineligible patients for a minimum of three years. This documentation is subject to review.


Hospitals/Survey vendors participating in HCAHPS are responsible for generating complete, accurate, and valid sample frame data files each month that contain all administrative information on all patients who meet the eligible population criteria. The following steps must be followed when creating the sample frame:

    • The sample frame for a particular month must include all eligible hospital discharges between the first and last days of the month (e.g., for January, any qualifying discharges between the 1st and 31st)

    • If a hospital conducts sampling at the end of each month, it must create the sample frame in a timely manner in order to initiate contact with all sampled patients within 42 days of discharge

    • Patients with missing or incomplete addresses and/or telephone numbers must not be removed from the sample frame. Instead, every attempt must be made to find the correct address and/or telephone number.

    • Patients whose eligibility status is uncertain must be included in the sample frame


The hospital/survey vendor must retain the sample frame (i.e., the entire list of eligible HCAHPS patients from which each hospital’s sample is pulled) for 3 years. Confidentiality note: Patient-identifying information within the sample frame will not be part of the final data submitted to CMS, nor will any other PHI.


Hospitals must submit at least 300 completed HCAHPS Surveys in a rolling four-quarter period (unless the hospital is too small to obtain 300 completed surveys). The absence of a sufficient number of HCAHPS-eligible discharges is the only acceptable reason for submitting fewer than 300 completed HCAHPS Surveys in a rolling four-quarter period. Because not all sampled patients who are contacted will actually complete the survey, hospitals/survey vendors are given guidance on how many discharges must be sampled to reach the required 300 completed surveys per four rolling quarters of data (a 12-month reporting period).


  2. Data Collection Procedures

  a. Statistical Methodology for Stratification and Sample Selection.

The basic sampling procedure for HCAHPS entails drawing a random sample of all eligible discharges from a hospital on a monthly basis. Sampling may be conducted either continuously throughout the month or at the end of the month, as long as a random sample is generated from the entire month. The HCAHPS sample must be drawn according to this uninterrupted random sampling protocol and not according to any “quota” system. Hospitals/Survey vendors must sample from every month throughout the entire 12-month reporting period and not stop sampling or curtail ongoing survey administration activities even if 300 completed surveys have been attained.


Sampling for HCAHPS is based on the eligible discharges (HCAHPS sample frame) for a calendar month. If every eligible discharge for a given month has the same probability of being sampled, then an equiprobable approach is being used. Stratified sampling is where eligible discharges are divided into non-overlapping subgroups, referred to as strata, before sampling.


There are three options for sampling patients for the HCAHPS Survey: Simple Random Sampling (SRS), Proportionate Stratified Random Sampling (PSRS), and Disproportionate Stratified Random Sampling (DSRS). Once a sample type is used within a quarter, it must be maintained throughout that quarter; “Sample Type” can only be changed at the beginning of a quarter. For more information about HCAHPS sampling, please see HCAHPS Quality Assurance Guidelines, V13.0, pp. 55-78, at www.hcahpsonline.org/en/quality-assurance/. An illustrative sampling sketch follows the list below.

    • SRS: Simple Random Sampling is the most basic sampling type; patients are randomly selected from all eligible discharges for a month. Strata are not used when employing SRS and each patient has equal opportunity of being selected into the sample, making SRS equiprobable. Census sampling is considered a form of simple random sampling.

    • PSRS: Proportionate Stratified Random Sampling uses strata definitions and random sample selection from all strata at equal rates. Since the sampling rates of the strata are “proportionate,” PSRS is also considered equiprobable.

    • DSRS: Disproportionate Stratified Random Sampling involves sampling within strata at different rates, and thus DSRS requires information about the strata. By definition, DSRS is not an equiprobable sampling approach, as it allows for dissimilar sampling rates across strata; all eligible discharges do not have an equal chance of being selected for inclusion in the monthly sample. To account for this, CMS requires additional information from hospitals and survey vendors that choose to use DSRS as a sampling type. Hospitals/survey vendors must submit an Exceptions Request Form and be approved before using DSRS.
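
As a rough illustration of the difference between SRS and PSRS, the sketch below draws a monthly sample two ways from a hypothetical sample frame. This is not CMS or survey vendor production code; the strata, discharge counts, and target sample size are fabricated for the example, and actual sampling must follow the protocols in the Quality Assurance Guidelines.

    import random

    def simple_random_sample(discharges, n):
        # SRS: every eligible discharge has the same chance of selection.
        return random.sample(discharges, min(n, len(discharges)))

    def proportionate_stratified_sample(frame_by_stratum, n_total):
        # PSRS: each stratum is sampled at the same rate, so the design
        # remains equiprobable overall.
        total = sum(len(d) for d in frame_by_stratum.values())
        sample = []
        for stratum, discharges in frame_by_stratum.items():
            n_stratum = round(n_total * len(discharges) / total)
            sample.extend(random.sample(discharges, min(n_stratum, len(discharges))))
        return sample

    # Hypothetical monthly sample frame, stratified by service line.
    frame = {
        "medical":   ["med-%d" % i for i in range(600)],
        "surgical":  ["surg-%d" % i for i in range(300)],
        "maternity": ["mat-%d" % i for i in range(100)],
    }
    all_discharges = [patient for stratum in frame.values() for patient in stratum]
    srs = simple_random_sample(all_discharges, 150)
    psrs = proportionate_stratified_sample(frame, 150)
    print(len(srs), len(psrs))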



  b. Estimation Procedures.

Not applicable to the HCAHPS Survey.



  c. Degree of Accuracy Needed for the Purpose Described in the Justification.

IPPS hospitals are expected to achieve at least 300 completed surveys over a 12-month period, if possible, to attain the desired degree of accuracy; please see below. HCAHPS scores based on fewer than 100 or 50 completed surveys are publicly reported, but the lower reliability of these scores is noted with an appropriate footnote in public reporting. HCAHPS scores based on fewer than 25 completed surveys are not reported on Hospital Compare; however, these hospitals do receive their HCAHPS scores in their confidential Preview Reports. IPPS hospitals must achieve at least 100 completed HCAHPS surveys during the 12-month Performance Period in order for a Patient and Caregiver Centered Experience of Care/Care Coordination domain score to be calculated in the Hospital Value-Based Purchasing program.

The HCAHPS Quality Assurance Guidelines, V13.0, pp. 61-63, describe how to calculate the sample size necessary to attain at least 300 completed surveys; see www.hcahpsonline.org/en/quality-assurance/. Elliott et al. provide additional information about the validity and reliability of the HCAHPS Survey; see Elliott, Lehrman, et al. (2010), “Do Hospitals Rank Differently on HCAHPS for Different Patient Subgroups?” Medical Care Research and Review, 67(1): 56-73.
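
As a back-of-the-envelope companion to that guidance, the sketch below estimates how many discharges a hospital would need to sample to reach 300 completed surveys at an assumed response rate. The 25 percent response rate is a hypothetical illustration, not an HCAHPS requirement; the Quality Assurance Guidelines remain the authoritative procedure.

    import math

    def required_sample_size(target_completes=300, expected_response_rate=0.25):
        # Approximate number of sampled discharges needed over 12 months to
        # reach the target number of completed surveys at an assumed response
        # rate. Illustrative only; see the Quality Assurance Guidelines for
        # the authoritative procedure.
        if expected_response_rate <= 0:
            raise ValueError("response rate must be positive")
        return math.ceil(target_completes / expected_response_rate)

    # At a hypothetical 25% response rate, roughly 1,200 sampled discharges
    # (about 100 per month) would be needed to yield 300 completes.
    annual = required_sample_size(300, 0.25)
    print(annual, math.ceil(annual / 12))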


  d. Unusual Problems Requiring Specialized Sampling Procedures.

CMS recognizes that some small hospitals may not have 300 HCAHPS-eligible discharges in the course of a year and so are unable to obtain 300 completed surveys in a 12-month reporting period. In such cases, IPPS hospitals must sample all eligible discharges (that is, conduct a census) and attempt to obtain as many completed surveys as possible.


  e. Any Use of Periodic Data Collection Cycles to Reduce Burden.

There is no use of periodic (less frequent than annual) data collection cycles for the HCAHPS Survey. Great effort was expended in considering how often HCAHPS data should be collected, and we solicited and received much comment on this issue when the HCAHPS Survey was being developed. Two options for the frequency of data collection were suggested: once during the year or continuous sampling. The majority of hospitals/vendors suggested that continuous sampling would be easier to integrate into their current data collection processes. Thus we decided to require sampling of discharges on a continuous (i.e., monthly) basis and to accumulate these samples to create rolling estimates based on 12 months of data. We chose to pursue the continuous sampling approach for the following reasons:

  • It is more easily integrated with many existing survey processes used for internal improvement,

  • Improvements in hospital care can be more quickly reflected in hospital scores (e.g., 12-month estimates could be updated on a quarterly or semi-annual basis),

  • Hospital scores are less susceptible to unique events that could affect hospital performance at a specific point in time,

  • It is less susceptible to gaming (e.g., hospitals being on their best behavior at the time of an annual survey), and

  • There is less variation in time between discharge and data collection.


  3. Maximizing Response Rates/Non-response and Issues of Accuracy, Reliability, and Validity.


Implementation of the HCAHPS Survey according to the protocols contained in the current HCAHPS Quality Assurance Guidelines, V13.0, helps hospitals attain the highest possible response rates. CMS examines hospital response rates every quarter and works with hospitals and survey vendors whose response rates are significantly below the national average. Among other tactics, CMS strongly encourages hospitals to offer the HCAHPS Survey in the language spoken at home by their patients.

Analysis of HCAHPS data indicates that the patient-mix adjustment applied to survey results adequately addresses the non-response bias that would exist without patient-mix adjustment; see Elliott, Zaslavsky, et al. (2009), “The Effects of Survey Mode, Patient Mix, and Nonresponse on CAHPS Hospital Survey Scores,” Health Services Research, 44(2): 501-518.


Information on statistical tests of the original 27-item HCAHPS Survey can be found in the original 2006 OMB package. Here we focus on the most recent tests of the validity and reliability of the current, 32-item HCAHPS Survey at the recommended 300 completed surveys in a 12-month public reporting period.


In terms of hospital-level reliability, the signal-to-noise ratio indicates how much of what is measured is “signal” (true variation in performance) rather than “noise” (measurement error). Unit- or hospital-level reliability (Spearman-Brown reliability) is the proportion of variance in hospital-level scores that reflects true variation among hospitals, not noise due to limited numbers of patient surveys; see Table 1.

Table 1: Hospital-Level Spearman-Brown Reliabilities of HCAHPS Top-Box Scores of HCAHPS Measures at 300 Completed Surveys, 3.1 million discharges, January – December 2016.


Measure                                Reliability
Communication with Nurses                 0.87
Communication with Doctors                0.86
Responsiveness of Hospital Staff          0.92
Communication about Pain                  0.87
Communication about Medicines             0.83
Cleanliness                               0.87
Quietness                                 0.93
Discharge Information                     0.83
Overall Rating                            0.91
Recommend Hospital                        0.93
Care Transition                           0.86


Generally accepted standards for reliability are: 0.7: Adequate (all HCAHPS measures exceed this); 0.8: Good (all exceed this); 0.85: Very good (9 of 11 reach this); and 0.9: Excellent (4 of 11 reach this level). It should be noted that with more than the recommended 300 completed surveys in a 12-month reporting period, HCAHPS reliability is even higher.
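
For readers who want to see how a Spearman-Brown hospital-level reliability such as those in Table 1 can be computed, the sketch below estimates the single-response intraclass correlation from a one-way ANOVA decomposition and projects it to 300 completed surveys. It is a simplified illustration on fabricated data, not the exact HCAHPS computation.

    import numpy as np

    def spearman_brown_reliability(scores_by_hospital, n_target=300):
        # Hospital-level reliability of a 0/100 top-box measure, projected to
        # n_target completed surveys per hospital, via a one-way ANOVA estimate
        # of the single-response intraclass correlation (ICC).
        groups = [np.asarray(g, dtype=float) for g in scores_by_hospital]
        k = len(groups)                                   # number of hospitals
        sizes = np.array([len(g) for g in groups], dtype=float)
        grand_mean = np.concatenate(groups).mean()
        ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
        ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
        ms_between = ss_between / (k - 1)
        ms_within = ss_within / (sizes.sum() - k)
        # Effective group size for unbalanced designs.
        n0 = (sizes.sum() - (sizes ** 2).sum() / sizes.sum()) / (k - 1)
        icc = (ms_between - ms_within) / (ms_between + (n0 - 1) * ms_within)
        # Spearman-Brown projection to n_target responses per hospital.
        return n_target * icc / (1 + (n_target - 1) * icc)

    # Fabricated example: three hospitals' top-box indicators (100 = top box).
    rng = np.random.default_rng(0)
    data = [rng.binomial(1, p, size=200) * 100 for p in (0.70, 0.78, 0.85)]
    print(round(float(spearman_brown_reliability(data, 300)), 2))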


Evidence of the internal consistency of the HCAHPS Survey can be found in Table 2, which displays Cronbach’s Alpha statistics for the seven publicly reported HCAHPS composite measures, each made up of two or three survey items.


Table 2: Cronbach’s Alphas of HCAHPS Composite Measures at 300 Completed Surveys, 2.9 million discharges, July 2015 - June 2016.

Composite Measure                      Cronbach’s Alpha
Communication with Nurses                   0.87
Communication with Doctors                  0.88
Responsiveness of Hospital Staff            0.73
Pain Management                             0.83
Communication about Medicines               0.69
Discharge Information                       0.51
Care Transition                             0.81
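
For reference, the internal-consistency statistic in Table 2 can be reproduced for any composite with the standard Cronbach's alpha formula, alpha = (k/(k-1)) * (1 - sum of item variances / variance of the summed score). The sketch below applies it to fabricated three-item responses; the data and scale are hypothetical.

    import numpy as np

    def cronbach_alpha(item_scores):
        # item_scores: 2-D array, rows = respondents, columns = items.
        x = np.asarray(item_scores, dtype=float)
        k = x.shape[1]
        item_variances = x.var(axis=0, ddof=1)
        total_variance = x.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Fabricated responses to a three-item composite on a 1-4 scale.
    rng = np.random.default_rng(1)
    latent = rng.normal(size=500)                 # shared "true experience"
    items = np.column_stack([
        np.clip(np.round(2.5 + latent + rng.normal(scale=0.8, size=500)), 1, 4)
        for _ in range(3)
    ])
    print(round(float(cronbach_alpha(items)), 2))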



Criterion validity indicates the extent to which HCAHPS measures what it purports to measure: patient experience of care. The patient-level correlations of specific HCAHPS measures with “Hospital Rating” and “Recommend the Hospital” can be used to assess criterion validity; see Table 3. We expect moderate, positive correlations of the other nine HCAHPS measures with these global patient experience of care measures. Correlations that are too high may indicate redundancy or a halo effect; correlations that are too low would indicate a lack of validity. We find moderate, positive correlations between the nine specific HCAHPS measures and the two global measures (Rating and Recommendation). All 18 of these correlations are between 0.28 and 0.64; 15 of the 18 equal or exceed 0.35.



Table 3: HCAHPS Patient-level Correlations, 3.1 million discharges, July 2015 – June 2016.



HCAHPS PATIENT-LEVEL CORRELATIONS*

(Columns cover the same measures, in the same order, as the rows.)

                                Nurse  Doctor  Staff  Pain   Rx Comm  Clean  Quiet  Discharge  CTM   Rating  Recommend
Communication with Nurses        1     0.52    0.56   0.56   0.50     0.38   0.32   0.27       0.44  0.64    0.58
Communication with Doctors             1       0.37   0.45   0.43     0.26   0.26   0.28       0.40  0.51    0.47
Responsiveness of Hosp. Staff                  1      0.48   0.41     0.34   0.31   0.20       0.35  0.51    0.44
Pain Management                                       1      0.44     0.31   0.30   0.25       0.39  0.54    0.48
Comm. About Medicines                                        1        0.33   0.29   0.35       0.45  0.48    0.43
Cleanliness of Hospital Env.                                          1      0.27   0.18       0.27  0.41    0.36
Quietness of Hospital Env.                                                   1      0.13       0.25  0.35    0.29
Discharge Information                                                                1          0.30  0.30    0.28
Care Transition                                                                                 1     0.48    0.45
Hospital Rating                                                                                       1       0.76
Recommend the Hospital                                                                                         1

* Patient-level Pearson correlations of rescaled linear means of HCAHPS measures, for patients discharged between July 2015 and June 2016 (3.1 million completed surveys). All correlations are significant at p<0.001.

* These results encompass all hospitals that received HCAHPS scores. Because not all hospitals report their results on Hospital Compare, the values on that Web site may differ from those shown here.
The correlations of the nine specific measures with the two global measures are larger at the hospital level than at the patient level, in part because measurement is more precise at the hospital level (as intended) than at the patient level; see Table 4. These hospital-level correlations range from 0.40 to 0.73, with all but four equal to or exceeding 0.50.

Table 4: Hospital Top-Box Correlations for HCAHPS Measures, July 2015 – June 2016 discharges (4,304 hospitals, 3.1 million surveys).




            Nurse  Doctor  Staff  Pain   RX Comm  Clean  Quiet  Discharge  CTM   Recommend  Rating
Nurse        1
Doctor       0.75   1
Staff        0.78   0.66    1
Pain         0.71   0.62    0.65   1
RX Comm      0.71   0.62    0.65   0.62   1
Clean        0.68   0.56    0.68   0.53   0.55     1
Quiet        0.57   0.58    0.60   0.48   0.52     0.52   1
Discharge    0.48   0.36    0.38   0.39   0.41     0.33   0.18   1
CTM          0.70   0.60    0.60   0.58   0.57     0.56   0.43   0.47       1
Recommend    0.65   0.55    0.50   0.55   0.49     0.49   0.40   0.49       0.72  1
Rating       0.72   0.61    0.63   0.61   0.58     0.58   0.51   0.51       0.73  0.86       1



The goal of HCAHPS is to collect information from patients using a standardized, national survey and to present the information based on those surveys to consumers, providers and hospitals. One of the methodological issues associated with making comparisons between hospitals is the need to adjust appropriately for patient-mix differences. Patient-mix refers to patient characteristics that are not under the control of the hospital that may affect measures of patient experiences, such as demographic characteristics and health status. The basic goal of adjusting for patient-mix is to estimate how different hospitals would be rated if they all provided care to comparable groups of patients.


CMS applies patient-mix adjustment to control for patient characteristics that affect ratings and that are differentially distributed across hospitals. Most of the patient-mix items are included in the “About You” section of the instrument, while others are taken from administrative records. Based on the mode experiment, and consistent with previous studies of patient-mix adjustment in CAHPS and in previous hospital patient surveys, we employ the following variables in the patient-mix adjustment model:

  • Self-reported general health status (specified as a linear variable)

  • Education (specified as a linear variable)

  • Type of service (medical, surgical, or maternity care) by gender

  • Age (specified as a categorical variable)

  • Lag time between discharge and survey

  • Age by service line interaction

  • Language (English, Spanish, Chinese, Russian/Vietnamese/Portuguese/Other) spoken at home

Once the data are adjusted for patient-mix, there is a fixed adjustment for each of the reported measures for mode of administration (discussed in detail below). The patient-mix adjustment employs a regression methodology, also referred to as covariance adjustment.
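
The following sketch illustrates the general idea of covariance (regression) adjustment described above: patient-mix coefficients are estimated from patient-level data, and each hospital's raw mean is shifted according to how its patient mix differs from the national mix. The covariate, data, and hospitals are hypothetical; the published HCAHPS adjustment uses CMS's specified model, variables, and coefficients.

    import numpy as np

    def patient_mix_adjust(scores, covariates, hospital_ids):
        # Estimate patient-mix coefficients on patient-level data, then shift
        # each hospital's raw mean by how its covariate means differ from the
        # national means (covariance adjustment).
        y = np.asarray(scores, dtype=float)
        x = np.asarray(covariates, dtype=float)       # rows = patients
        ids = np.asarray(hospital_ids)
        design = np.column_stack([np.ones(len(y)), x])
        beta = np.linalg.lstsq(design, y, rcond=None)[0][1:]
        national_means = x.mean(axis=0)
        adjusted = {}
        for h in np.unique(ids):
            mask = ids == h
            shift = (x[mask].mean(axis=0) - national_means) @ beta
            adjusted[str(h)] = float(y[mask].mean() - shift)
        return adjusted

    # Fabricated example: one covariate (self-reported health, 1-5), two hospitals.
    rng = np.random.default_rng(2)
    hospital = np.repeat(["A", "B"], 300)
    health = np.concatenate([rng.integers(3, 6, 300), rng.integers(1, 4, 300)])
    score = 70 + 3 * health + rng.normal(0, 10, 600) + np.where(hospital == "A", 0.0, 2.0)
    print({h: round(v, 1) for h, v in patient_mix_adjust(score, health[:, None], hospital).items()})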


The survey includes two items that capture the race and ethnicity of the respondent. These items are not included in the patient-mix adjustment model but are used as analytic variables to support the congressionally mandated “Quality Disparities Report,” which provides annual, national-level breakdowns of HCAHPS scores by race and ethnicity. Many hospitals collect information on race and ethnicity through their administrative systems, but coding is not standardized. It was therefore determined that administrative data are not adequate to support the analyses needed for the report and that the items should be included in the questionnaire.


  4. Past and Ongoing Tests of Procedures, Training, and Quality Improvement Activities; Collaborator and Contractor Participation (collaborators and contractors are bolded).


Hospitals began using the HCAHPS Survey in 2006 under the auspices of the Hospital Quality Alliance, a private/public partnership that includes CMS, the American Hospital Association, the Federation of American Hospitals, the Association of American Medical Colleges, the Joint Commission on Accreditation of Healthcare Organizations, the National Quality Forum, AARP, and the Agency for Healthcare Research and Quality (AHRQ).


Beginning in 2002, CMS undertook extensive research, planning, development, and testing activities for HCAHPS. CMS originally partnered with AHRQ, a leader in developing instruments for measuring patient perspectives on health care, to develop a standard instrument and data collection and reporting procedures that capture patients’ perspectives of their hospital care. Table 5 summarizes the process of creating and implementing the HCAHPS Survey from July 2002 through the first public reporting of HCAHPS results in March 2008.

Table 5: HCAHPS Survey Development and Implementation Timeline, 2002-2008.



Timeframe              Activity
July 2002              Published a “call for measures” in the Federal Register and received 7 submissions from Avatar, Edge Health Care Research, Healthcare Financial Management Association, Jackson Organization, Press Ganey Associates, National Research Corporation, Peace Health, Professional Research Consultants, and SSM Health Care.
Sept-Nov 2002          Completed literature review.
Oct 2002               Held a Web chat to answer questions about HCAHPS.
Oct 2002               Provided draft domains to CMS.
Nov 2002               Reviewed measures submitted in response to Federal Register Notice (FRN).
Nov 2002               Held Stakeholders Meeting to solicit suggestions and comments.
Nov 2002               Held vendors meeting to solicit suggestions and comments.
Jan 2003               AHRQ delivered 66-item draft survey to CMS for use in pilot test.
Jan-Feb 2003           Developed data collection and sampling methods, and developed analysis plan.
Feb 2003               Published FRN soliciting comments on draft HCAHPS.
Mar 2003               Completed hospital recruitment for pilot.
June 2003              Began data collection for CMS 3-state pilot test.
June 2003              Published a FRN soliciting comments on draft HCAHPS and asked for input about implementation issues.
Sept–Nov 2003          Analyzed data from CMS pilot test.
Fall 2003              Review of instrument by CAHPS Cultural Comparability team.
Fall 2003              Began CT pilot test of the HCAHPS instrument.
Nov 2003               Held HCAHPS Stakeholders’ Meeting at AHRQ.
Nov 2003               Revised HCAHPS instrument to 32 items.
Dec 2003               AHRQ submitted revised 32-item HCAHPS instrument to CMS.
Dec 2003-Feb 2004      Published a FRN soliciting input on the 32-item HCAHPS instrument and implementation strategy.
January 2004           Started coordination of national implementation with HSAG, the AZ QIO.
Jan 2004               Completed CT pilot test of HCAHPS.
Jan 2004               AHRQ submitted Analysis Report of the CMS 3-state pilot to CMS.
March – Sept 2004      Continued discussions with hospitals, vendors, and consumers to follow up on FRN comments from February.
Oct 2004               Revised 25-item HCAHPS instrument submitted by AHRQ to CMS.
November 2004          Submitted HCAHPS to NQF for its consensus development process.
December 2004          Started developing training documents for national implementation.
April 2005             Started discussions regarding data transmission via QNET & other issues with the IFMC, the IA QIO.
June 2005              Formed the Data Integrity Group.
May 2005               Received endorsement for 27-item HCAHPS from the National Quality Forum (NQF).
May 2005               Modified survey instrument and protocol to 27 items.
June 2005              Abt Associates, Inc. received OMB approval for cost-benefit analysis.
July 2005              Established the Data Submission and Reporting Group.
October 2005           Abt Associates, Inc. submitted final report of the cost-benefit analysis.
November 2005          Published FRN soliciting comments on draft CAHPS Hospital Survey.
December 2005          Received final approval.
February – May 2006    Conducted Mode Experiment.
October 2006           National implementation began.
July 2007              HCAHPS participation linked to RHQDAPU program (“pay for reporting”).
March 2008             First public reporting of HCAHPS results.


Throughout the HCAHPS development process, CMS solicited and received a great deal of public input. As a result, the HCAHPS questionnaire and methodology went through several iterations prior to national implementation. The accumulated lessons learned from the pilot testing, public comments, input from stakeholders, numerous team discussions, and the National Quality Forum’s review and endorsement (NQF #0166) through its consensus development process led to the national implementation in 2006 of the 27-item HCAHPS Survey and the HCAHPS data collection protocol that allows hospitals to integrate their own supplemental questions. The resulting core questionnaire comprises questions in several dimensions of primary importance to the target audience: doctor communication, responsiveness of hospital staff, cleanliness of the hospital environment, quietness of the hospital environment, nurse communication, pain management, communication about medicines, and discharge information. The HCAHPS Survey was re-endorsed by the NQF in 2012 and 2015.



Responding to concerns raised by some stakeholders that the HCAHPS pain items create pressure on physicians to overprescribe opioids in hopes of achieving better survey results, CMS announced plans in the FY 2018 IPPS Final Rule to replace the original Pain Management questions (items 12, 13, and 14 on the HCAHPS Survey) with three new questions that focus on communication about pain. The new pain items are required on all surveys administered to patients discharged on or after January 1, 2018, and comprise a new composite measure, “Communication About Pain.”


The HCAHPS implementation plan changed significantly as a result of the public input we received prior to national implementation. CMS made the following major changes in the implementation approach, which have served to lessen the burden on hospitals/survey vendors:

  • reduced the number of mailings for the “mail only” survey protocol from three to two;

  • reduced the number of follow-up phone calls for the “telephone only” survey protocol from ten to five;

  • added active interactive voice response (IVR) as a mode of survey administration;

  • eliminated the 50% response rate requirement;

  • reduced the number of patient discharges to be surveyed.


Since national implementation began in 2006, CMS has:

  • continually refined and clarified HCAHPS survey protocols;

  • created new translations of the mail version of the survey in Chinese, Russian, Vietnamese, and Portuguese, in addition to the original English and Spanish versions (in the telephone and IVR modes, the HCAHPS Survey is available in English and Spanish);

  • added five new survey items, including three items that form the new composite measure, Care Transition;

  • annually updated the HCAHPS Quality Assurance Guidelines (currently Version 13.0);

  • improved the appearance and accessibility of HCAHPS results on the Hospital Compare website, https://www.medicare.gov/hospitalcompare/; and

  • made information about HCAHPS quickly and easily available through its official HCAHPS On-Line website, www.hcahpsonline.org.

There has been steady and significant improvement in HCAHPS scores since national implementation; see Elliott, Lehrman, Goldstein, et al. (2010), “Hospital Survey Shows Improvements in Patient Experience,” Health Affairs, 29(11): 2061-2067. The HCAHPS Project Team has published a number of analyses of HCAHPS results in peer-reviewed scientific journals and made numerous presentations at professional conferences; a bibliography of these publications can be found at: http://www.hcahpsonline.org/globalassets/hcahps/home/october_2016_home_bibliography.pdf.


Since public reporting of hospitals’ HCAHPS results was inaugurated in March 2008, a growing number of healthcare, consumer, and professional organizations, state governments, media outlets, and others have adopted or incorporated HCAHPS scores, in part or in whole, for their own purposes. These activities, external to CMS, have had the effect of extending knowledge about HCAHPS and increasing the impact of survey results. The content of the HCAHPS Survey, its methodology and administration protocols, and its ambition to measure and publicly report consumers’ experiences in a uniform and standardized manner have influenced other surveys developed within CMS, as well as those undertaken by hospitals and healthcare systems in the United States and abroad.


There are distinct roles for hospitals or their survey vendors and for the federal government in the national implementation of HCAHPS. The federal government is responsible for support and public reporting, including:

    • conducting training on data collection and submission procedures,

    • providing on-going technical assistance,

    • ensuring the integrity of data collection,

    • accumulating HCAHPS data from individual hospitals,

    • producing patient-mix adjusted hospital-level estimates,

    • conducting research on the presentation of data for public reporting, and

    • publicly reporting the comparative hospital data.


Hospitals or their survey vendors are responsible for data collection, including: developing a sampling frame of relevant discharges, drawing the sample of discharges to be surveyed, collecting survey data from sampled discharges, and submitting HCAHPS data to CMS in a standard format. The data files are formatted so that hospitals/vendors submit de-identified data to CMS in accordance with 45 CFR §164.514. Hospitals maintain business associate agreements with their contracted survey vendors to collect and submit HCAHPS survey data through the secure QualityNet Exchange portal and data warehouse.


CMS began its collaboration with the Health Services Advisory Group (HSAG) in 2003 to coordinate the national implementation of the Hospital CAHPS Survey. HSAG’s role includes technical assistance and training for vendors and hospitals; data validation, processing, analysis, and adjustment; and oversight of self-administering hospitals and survey vendors. HSAG also produces electronic data files and a hospital-level extract file for public reporting of the HCAHPS scores.


In the spring of 2006, CMS conducted a large-scale experiment to assess the impact of mode of survey administration, patient characteristics, and patient non-response on HCAHPS results. This first mode experiment was based on a nationwide random sample of short-term acute care hospitals. Hospitals from each of CMS’ ten geographic regions participated in the Mode Experiment. A hospital's probability of being selected for the sample was proportional to its volume of discharges, which guaranteed that each patient would have an equal probability of being sampled for the experiment. The participating hospitals contributed patient discharges from a four-month period: February, March, April, and May 2006. Within each hospital, an equal number of patients was randomly assigned to each of the four modes of survey administration. Sample selection and surveying were conducted by the National Opinion Research Center of the University of Chicago, and the data were analyzed by the RAND Corporation.

A randomized mode experiment of 27,229 discharges from 45 hospitals was used to develop adjustments for the effects of survey mode (Mail Only, Telephone Only, Mixed mode, or Active Interactive Voice Response) on responses to the HCAHPS Survey. In general, patients randomized to the Telephone Only and Active Interactive Voice Response provided more positive evaluations than patients randomized to Mail Only and Mixed (Mail with Telephone follow-up) modes. These mode effects varied little by hospital, and were strongest for global items (rating and recommendation), and the Cleanliness & Quiet, Responsiveness, Pain Management, and Discharge Information composites. Adjustments for these mode effects are necessary to make the reported scores independent of the survey mode that was used. These adjustments are applied to HCAHPS results before they are publicly reported on the Hospital Compare website. The mode adjustments can be found in the “Mode Adjustment” section of the HCAHPS website, http://www.hcahpsonline.org.
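
To make the mechanics concrete, the sketch below shows how a fixed, additive mode adjustment could be applied to a patient-mix-adjusted score before reporting. The adjustment values in the sketch are hypothetical placeholders; the actual HCAHPS mode adjustments are those published in the “Mode Adjustment” section of www.hcahpsonline.org.

    # Hypothetical, additive survey-mode adjustments (percentage points) for a
    # single measure; the real values are published on www.hcahpsonline.org.
    HYPOTHETICAL_MODE_ADJUSTMENTS = {
        "Overall Rating": {"Mail": 0.0, "Telephone": -2.0, "Mixed": 0.0, "IVR": -3.0},
    }

    def apply_mode_adjustment(measure, mode, patient_mix_adjusted_score):
        # Shift a hospital's patient-mix-adjusted score by the fixed adjustment
        # for the survey mode it used, so reported scores are comparable
        # across modes.
        return patient_mix_adjusted_score + HYPOTHETICAL_MODE_ADJUSTMENTS[measure][mode]

    print(apply_mode_adjustment("Overall Rating", "Telephone", 78.5))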


The Mode Experiment also provided valuable information on the impact of salient patient characteristics and non-response bias on HCAHPS results. This analysis was needed because hospitals do not provide care for comparable groups of patients but, as demonstrated in the HCAHPS Three-State Pilot Study, some patient characteristics may affect measures of patient experiences of care. The goal of patient-mix adjustment, which is also known as case-mix adjustment, is to estimate how different hospitals would be rated if they provided care to comparable groups of patients. As suggested by the Three-State Pilot Study, a set of patient characteristics not under the control of the hospital was selected for analysis. In summary, the most important patient-mix adjustment items were patients’ self-reported health status, education, service line (maternity, medical, or surgical care), and age. In addition, after mode and patient-mix adjustments were made, non-response effects were found to be negligible. A report on patient-mix and non-response adjustments, as well as the mode adjustments and current and past patient-mix adjustments, is available on the HCAHPS On-Line website, http://www.hcahpsonline.org/en/mode--patient-mix-adj/.

We also looked at the extent to which each domain contributes to measurement in priority areas established by an independent, expert body on quality measurement, the National Quality Forum (NQF). The HCAHPS domains “communication with doctors,” “communication with nurses,” and “communication about medications” contribute to the NQF’s priority on improving care coordination and communication. The HCAHPS “discharge information” domain contributes to the priority on improving self-management and health literacy.


CMS, with assistance from HSAG and its subcontractor, the National Committee for Quality Assurance (NCQA), has developed and conducted annual training sessions for self-administering hospitals and survey vendors participating in the HCAHPS Survey, as well as others interested in this program. HCAHPS Introductory Training was first offered at CMS headquarters in January 2006, and then by webinar in January and April 2006. Since then, HCAHPS Introductory Training has been held annually, by webinar. In addition, CMS developed the HCAHPS Update Training program. HCAHPS Update Training was first offered in May 2007 and has been offered annually by webinar since then.

HCAHPS Introductory Training is required for self-administering hospitals and survey vendors that wish to join HCAHPS. HCAHPS Update Training provides information on important changes to the HCAHPS program and is required for all self-administering hospitals and survey vendors participating in HCAHPS.


As part of an initiative to add five-star quality ratings to its Compare Web sites, CMS added HCAHPS Star Ratings to the Hospital Compare Web site in April 2015. Star ratings make it easier for consumers to use the information on the Compare Web sites and spotlight excellence in healthcare quality. Twelve HCAHPS Star Ratings appear on Hospital Compare: one for each of the 11 publicly reported HCAHPS measures, plus the HCAHPS Summary Star Rating. HCAHPS Star Ratings were the first star ratings to appear on Hospital Compare, and CMS updates them each quarter. In support of the addition of HCAHPS Star Ratings to Hospital Compare, we created a special section of the HCAHPS On-Line Web site on which we post current information and archives, including the HCAHPS Star Ratings Technical Notes (which explain the calculation of the star ratings), distributions of HCAHPS Star Ratings, HCAHPS Summary Star Rating distributions by state, slides from a national provider call on HCAHPS Star Ratings, and frequently asked questions; see http://www.hcahpsonline.org/en/hcahps-star-ratings/.


  5. Names and Telephone Numbers of Individuals Consulted on Statistical Aspects of the Survey and Implementation Design and Names of Agency Units, Contractor(s), Grantee(s), or Other Persons(s) who Collect and/or Analyze the Information for the Agency.


Marc N. Elliott, PhD

Distinguished Chair in Statistics; Senior Principal Researcher
RAND Corporation

(310) 393-0411, x7931

[email protected]


Christopher Cohea, MS
Director, Research and Analysis
Health Services Advisory Group, Inc.
(602) 471-3673

[email protected]
