
National Implementation of Hospital Consumer Assessment of Health Providers and Systems (HCAHPS)

OMB: 0938-0981


Supporting Statement for the National Implementation of the Hospital CAHPS Survey


A 1.0 CIRCUMSTANCES OF INFORMATION COLLECTION


A 1.1 Background



The intent of the HCAHPS initiative is to provide a standardized survey instrument and data collection methodology for measuring patients’ perspectives on hospital care. While many hospitals currently collect information on patients’ satisfaction with care, there is no national standard for collecting or publicly reporting this information that would enable valid comparisons to be made across all hospitals. In order to make “apples to apples” comparisons to support consumer choice, it is necessary to introduce a standard measurement approach. HCAHPS can be viewed as a core set of questions that can be combined with a broader, customized set of hospital-specific items. HCAHPS is meant to complement the data hospitals currently collect to support improvements in internal customer services and quality related activities.


Hospitals began using HCAHPS, also known as Hospital CAHPS or the CAHPS Hospital Survey, under the auspices of the Hospital Quality Alliance, a private/public partnership that includes the American Hospital Association, the Federation of American Hospitals, the Association of American Medical Colleges, the Joint Commission on Accreditation of Healthcare Organizations, the National Quality Forum, AARP, CMS/AHRQ, and other stakeholders who share a common interest in reporting on hospital quality. This Alliance has been proactive in making performance data on hospitals accessible to the public, thereby improving care. In March 2008, the first results from the HCAHPS survey will be publicly reported on the Hospital Compare website, which can be found at www.hospitalcompare.hhs.gov, or through a link on www.medicare.gov.


CMS is sensitive to the costs that will be borne by the hospitals that voluntarily participate in the HCAHPS initiative. In order to gain a full and detailed understanding of the range of costs associated with implementation of HCAHPS, CMS commissioned Abt Associates, Inc. to conduct a thorough investigation of the costs and benefits of HCAHPS.


Costs associated with collecting HCAHPS will vary depending on:

    • The method hospitals currently use to collect patient survey data;

    • The number of patients surveyed (target is 300 per year); and

    • Whether it is possible to incorporate HCAHPS into their existing survey.


Abt Associates’ cost estimates are for data collection and transmission to CMS only and do not include administrative, information technology, or other costs that hospitals may incur as a result of HCAHPS. Abt estimates that the average data collection cost for a hospital conducting the 27-item version of HCAHPS as a stand-alone survey would be between $3,300 - $4,575 per year, assuming that 80-85 percent of hospitals collect HCAHPS by mail and the remainder by phone or active IVR. The costs for combining HCAHPS with existing surveys would be considerably less. It would cost $978 per hospital annually to incorporate the 27-item version of HCAHPS into existing surveys.


Abt estimates that the nationwide data collection cost for conducting the 27-item version of HCAHPS would be between $4.1 million and $19.1 million per year if all eligible hospitals participated, depending upon the extent to which hospitals integrate HCAHPS with their existing survey activities or conduct it as a stand-alone survey. In the context of overall hospital costs, the maximum national cost estimate of HCAHPS represents less than 0.02 percent of overall hospital costs, even if all hospitals collect HCAHPS as a stand-alone survey.
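As a rough consistency check, the per-hospital and national figures above can be tied together with simple arithmetic. The short sketch below infers the approximate number of eligible hospitals from the report's own totals rather than from any stated count, so it is illustrative only and is not part of the Abt analysis.

# Back-of-envelope check of the national cost range using the per-hospital
# estimates quoted above. The count of eligible hospitals is not stated here;
# it is inferred from the report's own totals (about $4.1M / $978 per hospital).
integrated_cost = 978          # per hospital, HCAHPS folded into an existing survey
standalone_cost_high = 4575    # per hospital, upper end of the stand-alone estimate

implied_hospitals = 4_100_000 / integrated_cost        # roughly 4,200 hospitals

low_total = implied_hospitals * integrated_cost         # every hospital integrates
high_total = implied_hospitals * standalone_cost_high   # every hospital runs stand-alone

print(f"implied eligible hospitals: {implied_hospitals:,.0f}")
print(f"national range: ${low_total/1e6:.1f}M to ${high_total/1e6:.1f}M per year")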


While the potential benefits of HCAHPS cannot be enumerated as precisely as its costs, Abt concluded that:

"What we can conclude with some level of confidence is that the marginal costs associated with a longer version of HCAHPS are likely to be relatively small, so if there is a reasonable basis for believing that the 27-item version of HCAHPS offers better information to consumers than a shorter alternative, then there are good reasons for implementing the current 27-item version of HCAHPS."


The full report produced by Abt Associates, "Costs and Benefits of HCAHPS," can be found at the following Internet site: http://www.cms.hhs.gov/quality/hospital/ in the section "Perspectives on Care HCAHPS."


In the spring of 2002, the Centers for Medicare & Medicaid Services (CMS) requested that the Agency for Healthcare Research and Quality (AHRQ), through the CAHPS team, develop and test a survey through which hospital patients could assess the care they receive. This process has been consistent with other CAHPS survey development processes, including: review of the relevant literature; review of existing hospital surveys obtained through a public call for measures; pilot testing; and a series of opportunities for public input.


Three broad goals have shaped HCAHPS. First, the survey is designed to produce comparable data on the patient’s perspective on care that allows objective and meaningful comparisons between hospitals on domains that are important to consumers. Second, public reporting of the survey results is designed to create incentives for hospitals to improve their quality of care. Third, public reporting will serve to enhance public accountability in health care by increasing the transparency of the quality of hospital care provided in return for the public investment. With these goals in mind, the HCAHPS project has taken substantial steps to assure that the survey will be credible, useful, and practical. This methodology and the information it generates will be made available to the public.


As mentioned previously, CMS has partnered with AHRQ to develop a standard instrument and data collection and reporting procedures that will capture patients’ perspectives of their hospital care. AHRQ is a leader in developing instruments for measuring patient perspectives on care. AHRQ and its grantees developed the Health Plans CAHPS Survey, which is currently used to assess the care provided by health plans covering over 132 million Americans. While CAHPS has been accepted as the industry standard for measuring consumers’ perspectives in the healthcare system, it does not address patients’ perspectives in the acute care setting. In response to this need, AHRQ initiated the process of developing a public domain survey instrument for the acute care setting. AHRQ has used its experience and expertise to work with CMS to develop both a standard survey for measuring patients’ reports and ratings of their care in the hospital setting, and approaches to reporting the results to consumers. Steps in creating and implementing the survey are summarized in Table 1.

Table 1: HOSPITAL CAHPS SURVEY® Development and Implementation

Activity | Timeframe
Published a "call for measures" in the Federal Register and received 7 submissions from Avatar, Edge Health Care Research, Healthcare Financial Management Association, Jackson Organization, Press Ganey Associates, National Research Corporation, Peace Health, Professional Research Consultants, and SSM Health Care. | July 2002
Completed literature review. | Sept-Nov 2002
Held a Web chat to answer questions about HCAHPS. | Oct 2002
Provided draft domains to CMS. | Oct 2002
Reviewed measures submitted in response to Federal Register Notice (FRN). | Nov 2002
Held Stakeholders Meeting to solicit suggestions and comments. | Nov 2002
Held vendors meeting to solicit suggestions and comments. | Nov 2002
AHRQ delivered 66-item draft survey to CMS for use in pilot test. | Jan 2003
Continued cognitive testing, developed data collection and sampling methods, and developed analysis plan. | Jan-Feb 2003
Published FRN soliciting comments on draft HCAHPS. | Feb 2003
Completed hospital recruitment for pilot. | Mar 2003
Began data collection for CMS 3-state pilot test. | June 2003
Published a FRN soliciting comments on draft HCAHPS and asked for input about implementation issues. | June 2003
Analyzed data from CMS pilot test. | Sept-Nov 2003
Review of instrument by CAHPS Cultural Comparability team. | Fall 2004
Began CT pilot test of the HCAHPS instrument. | Fall 2003
Held HCAHPS Stakeholders' Meeting at AHRQ. | Nov 2003
Revised HCAHPS instrument to 32 items. | Nov 2003
AHRQ submitted revised 32-item HCAHPS instrument to CMS. | Dec 2003
Published a FRN soliciting input for 32-item HCAHPS instrument and implementation strategy. | Dec 2003-Feb 2004
Started coordination of national implementation with HSAG, the AZ QIO. | January 2004
Completed CT pilot test of HCAHPS. | Jan 2004
AHRQ submitted Analysis Report of the CMS 3-state pilot to CMS. | Jan 2004
Continued discussions with hospitals, vendors, and consumers to follow up on FRN comments from February. | March-Sept 2004
Revised 25-item HCAHPS instrument submitted by AHRQ to CMS. | Oct 2004
Submitted HCAHPS to NQF for its consensus development process. | November 2004
Started developing training documents for national implementation. | December 2004
Started discussions regarding data transmission via QNET & other issues with the IFMC, the IA QIO. | April 2005
Formed the Data Integrity Group. | June 2005
Received endorsement for 27-item HCAHPS from the National Quality Forum. | May 2005
Modified survey instrument and protocol to 27 items. | May 2005
Abt Associates, Inc. receives OMB approval for cost-benefit analysis. | June 2005
Established the Data Submission and Reporting Group. | July 2005
Abt Associates, Inc. submits final report of the cost-benefit analysis. | October 2005
Published FRN soliciting comments on draft CAHPS Hospital Survey. | November 2005
Received final approval. | December 2005
Mode Experiment. | February-May 2006
HCAHPS training for self-administering hospitals and survey vendors. | Jan. & April 2006; Jan. & May 2007
"Dry Run" of HCAHPS survey. | April-June 2006; Mar. 2007
National Implementation begins. | October 2006
HCAHPS participation linked to RHQDAPU program ("pay for reporting"). | July 2007
Begin site visits to survey vendors and self-administering hospitals for oversight. | August 2008
Next "Dry Run" of HCAHPS survey. | September 2007
Next opportunity for hospitals to join HCAHPS. | January 2008
First public reporting of HCAHPS results. | March 2008



Throughout the HCAHPS development process, CMS has solicited and received a great deal of public input. As a result, the HCAHPS questionnaire and methodology have gone through several iterations. The first was a 66-item version that was tested in a three-state pilot study (this was developed for testing purposes; we never intended that the final version be this long). Prior to the start of the pilot test, a Federal Register notice was published in February 2003 soliciting input on the proposed pilot study. This notice produced nearly 150 comments. Based on results of the pilot study, the questionnaire was reduced to 32 items. CMS received additional feedback from a Federal Register Notice published in June 2003 that sought further comment on the survey and implementation issues while the initial pilot testing was underway. CMS received 110 responses to the notice from hospital associations, provider groups, consumers/purchasers, and hospital survey vendors.


A 32-item version of the HCAHPS instrument was published in the Federal Register in December 2003 for public comment; CMS received nearly 600 comments that focused on the following topics: sampling, response rate, implementation procedures, cost issues, the length of the instrument, exclusion categories, question wording, and reporting.


CMS, AHRQ, the CAHPS grantees, and members of the Instrument, Analysis, and Cultural Comparability teams met via conference calls two to three times per week to review all comments from the three Federal Register Notices and to modify the survey instrument and implementation strategy based on the comments. The input we received was conflicting: survey vendors and many hospitals indicated that the 32-item version was too long, while consumer groups indicated that the full content was needed to support consumer choice. After the comments were reviewed, CMS and AHRQ held additional discussions with hospitals, vendors, and consumers to discuss the comments received.


Using the comments received from the public and stakeholders, and the psychometric analysis of the data from the 3-state pilot study, CMS reduced the next version of the HCAHPS questionnaire to 25 items. The following questions were eliminated from the longer versions of the questionnaire: overall rating of the doctor; overall rating of the nurse; how often did doctors treat you with courtesy and respect; how often did nurses treat you with courtesy and respect; overall mental health status; and two items related to whether and how a proxy helped complete the survey. The item regarding how often hospital staff asked if you were allergic to any medicine was also eliminated from the survey. A question from the original 66-item survey was used to replace the allergy question. This newly re-introduced item asks, "Before giving you the medicine, how often did hospital staff tell you what the medicine was for?" (Question 14 on the 25-item survey). In response to public input, we also eliminated the reference to doctors in the screener question that identifies patients who needed help getting to the bathroom or using a bedpan (Question 8 on the 25-item survey).


The questions on overall mental health status and the two proxy questions were dropped because their impact in the patient-mix adjustment was negligible. Questions about being shown courtesy and respect by doctors and nurses were pulled from the 25-item version because input received indicated that the two items on doctors or nurses "explaining things fully" and "listening carefully" were synonymous with showing courtesy and respect. Taking these questions out of these composites did not adversely impact the psychometric properties of the doctor and nurse communication composites. The allergy question originally had a scale of "never" to "always" that did not work well because the question is usually asked once upon admission. Changing the scale to a "yes/no" response provided very little variation across hospitals.


As the HCAHPS survey and implementation procedures evolved, CMS and AHRQ worked to develop procedures to allow for as much flexibility as possible to minimize disruption to current survey activities to the extent possible. For instance, prior to developing an implementation strategy CMS solicited input through the June 2003 Federal Register notice requesting feedback on mode of survey administration, periodicity of administration, and specific criteria for inclusion in the sampling frame. Using this feedback, CMS developed an implementation strategy that has been modified as a result of additional input received.


The basic implementation procedures were set up to provide flexibility to survey vendors and hospitals. For example, following training, approved hospitals or vendors can administer the Hospital CAHPS survey either as a (a) stand-alone survey or (b) integrated with the hospital’s existing survey. The survey will be conducted continuously throughout the year to make it easier to integrate with existing survey activities. Because vendors and hospitals currently use multiple modes to administer their internal hospital surveys, multiple survey modes are allowed. The hospital /vendors may use any of the following survey modes: telephone only, mail only, a mixed methodology of mail with telephone follow-up, or active interactive voice response (IVR). All modes of administration must follow a standardized protocol. Since different modes of administration may affect how patients respond, CMS will be conducting a large-scale mode experiment to determine appropriate adjustments to the data for public reporting.


In addition to the development and review processes outlined above, CMS submitted the 25-item version of the HCAHPS instrument to the National Quality Forum (NQF), a voluntary consensus standard-setting organization established to standardize healthcare quality measurement and reporting, for its review and endorsement. NQF endorsement represents the consensus of numerous healthcare providers, consumer groups, professional associations, purchasers, federal agencies, and research and quality organizations. Following a thorough, multi-stage review process, HCAHPS was endorsed by the NQF board in May 2005. In the process, NQF recommended a few modifications to the instrument. As a result of the recommendation of the National Quality Forum Consensus Development Process, the two courtesy and respect items were added back into the survey. The review committee felt that these questions are important to all patients, but may be particularly meaningful to patients who are members of racial and ethnic minority groups. The two reinstated items are: "During this hospital stay, how often did nurses treat you with courtesy and respect?" and "During this hospital stay, how often did doctors treat you with courtesy and respect?"


Another recommendation from the NQF was to expand the response categories for the ethnicity question in the “About You” section as follows:

  • No, not Spanish/Hispanic/Latino

  • Yes, Puerto Rican

  • Yes, Mexican, Mexican American, Chicano

  • Yes, Cuban

  • Yes, other Spanish/Hispanic/Latino


Acting on another recommendation of the National Quality Forum, CMS further examined the costs and benefits of HCAHPS. This cost-benefit analysis of HCAHPS was independently conducted by Abt Associates, Inc., and completed on October 5, 2005. The Executive Summary and Final Report of the cost-benefit analysis, "Costs and Benefits of HCAHPS," may be found at the following Internet site: http://www.cms.hhs.gov/quality/hospital/ in the section called "Perspectives on Care CAHPS."


The accumulated lessons learned from the pilot testing, public comments, input from stakeholders, numerous team discussions, and the National Quality Forum's review and endorsement through its consensus development process led to the latest version of the HCAHPS survey, which has 27 items, and to the HCAHPS data collection protocol that allows hospitals to integrate their own specialized questions. The resulting core questionnaire comprises questions covering several dimensions of primary importance to the target audience: doctor communication, responsiveness of nurses, quiet and cleanliness of the physical environment, nurse communication, pain control, communication about medicines, and discharge information. The 27-item HCAHPS survey that was formally endorsed by the NQF may be found in Appendix A.


The HCAHPS implementation plan, in particular, has been changed significantly as a result of the public input we have received. CMS has made the following major changes in the implementation approach:

  • reduced the number of mailings for the "mail only" survey protocol from three to two;

  • reduced the number of follow-up phone calls for the "telephone only" survey protocol from ten to five;

  • added active interactive voice response (IVR) as a mode of survey administration;

  • eliminated the cap on the number of hospital/vendor questions added to the HCAHPS items;

  • eliminated the 50% response rate requirement; and

  • reduced the number of patient discharges to be surveyed.


There are distinct roles for hospitals, or their survey vendors, and the federal government in the national implementation of HCAHPS. The government is responsible for overall support and public reporting, including:

  • conducting training on data collection and submission procedures;

  • providing on-going technical assistance;

  • ensuring the integrity of data collection;

  • accumulating HCAHPS data from individual hospitals;

  • producing patient-mix-adjusted hospital-level estimates;

  • conducting research on the presentation of data for public reporting; and,

  • publicly reporting the comparative hospital data.


Hospitals or their survey vendors are responsible for data collection, including: developing a sampling frame of relevant discharges, drawing the sample of discharges to be surveyed, collecting survey data from sampled discharges, and submitting HCAHPS data to CMS in a standard format. We have formatted the data files so that hospitals/vendors will submit de-identified data files to CMS, following 45 CFR §164.514. As they currently do, hospitals will maintain business associate agreements with their survey vendors. Hospitals securely submit their HCAHPS data to CMS through QualityNet Exchange.


CMS began collaborating with the Health Services Advisory Group (HSAG) in 2003 to coordinate the national implementation of the Hospital CAHPS Survey. HSAG's role is to provide technical assistance and training for vendors and hospitals; data validation, processing, analysis, and adjustment; and oversight of self-administering hospitals and survey vendors. HSAG will also produce electronic data files and a hospital-level extract file for public reporting of the HCAHPS scores.


In the spring of 2006, CMS conducted a large-scale experiment to assess the impact of mode of survey administration, patient characteristics, and patient non-response on HCAHPS results. This Mode Experiment was based on a nationwide random sample of short-term acute care hospitals. Hospitals from each of CMS' ten geographic regions participated in the Mode Experiment. A hospital's probability of being selected for the sample was proportional to its volume of discharges, which helped ensure that each patient would have an approximately equal probability of being sampled for the experiment. The participating hospitals contributed patient discharges from a four-month period: February, March, April, and May 2006. Within each hospital, an equal number of patients was randomly assigned to each of the four modes of survey administration. Sample selection and surveying were conducted by the National Opinion Research Center of the University of Chicago, and the data were analyzed by the RAND Corporation.
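The sampling logic just described can be illustrated with a small sketch. The hospital names, discharge counts, and sample sizes below are invented, and an equal per-hospital sample size is assumed; the point is only to show why probability-proportional-to-size selection of hospitals, paired with a fixed number of sampled patients per selected hospital, yields roughly equal inclusion probabilities for individual patients.

# Minimal, hypothetical illustration of the Mode Experiment sampling design:
# hospitals are selected with probability proportional to discharge volume (PPS),
# then an equal number of patients is sampled within each selected hospital.
# Hospital IDs and discharge counts below are invented for illustration.

frame = {"H1": 12_000, "H2": 10_000, "H3": 8_000, "H4": 6_000, "H5": 4_000}
n_hospitals = 2               # hospitals to draw (illustrative)
patients_per_hospital = 100   # equal allocation within each selected hospital

total_discharges = sum(frame.values())

for hospital, discharges in frame.items():
    # Approximate PPS selection probability for the hospital ...
    p_hospital = n_hospitals * discharges / total_discharges
    # ... times the within-hospital sampling rate. The discharge counts cancel,
    # so every patient ends up with about the same overall inclusion probability.
    p_patient = p_hospital * patients_per_hospital / discharges
    print(f"{hospital}: P(patient sampled) = {p_patient:.4f}")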


A randomized mode experiment of 27,229 discharges from 45 hospitals was used to develop adjustments for the effects of survey mode (Mail Only, Telephone Only, Mixed mode, or Active Interactive Voice Response) on responses to the HCAHPS survey. In general, patients randomized to the Telephone Only and Active Interactive Voice Response modes provided more positive evaluations than patients randomized to the Mail Only and Mixed (Mail with Telephone follow-up) modes. These mode effects varied little by hospital, and were strongest for the global items (rating and recommendation) and the Cleanliness & Quiet, Responsiveness, Pain Management, and Discharge Information composites. Adjustments for these mode effects are necessary to make the reported scores independent of the survey mode that was used. These adjustments will be applied to HCAHPS results before they are publicly reported on the Hospital Compare website. The mode adjustments can be found in the "Mode Adjustment" section of the HCAHPS website, www.hcahpsonline.org.


The Mode Experiment also provided valuable information on the impact of salient patient characteristics and non-response bias on HCAHPS results. This analysis was needed because hospitals do not provide care for comparable groups of patients and, as demonstrated in the HCAHPS Three-State Pilot Study, some patient characteristics may affect measures of patient experiences of care. The goal of patient-mix adjustment, which is also known as case-mix adjustment, is to estimate how different hospitals would be rated if they provided care to comparable groups of patients. As suggested by the Three-State Pilot Study, a set of patient characteristics not under the control of the hospital was selected for analysis. In summary, the most important patient-mix adjustment items were patients' self-reported health status, education, service line (maternity, medical, or surgical care), and age. Patient-mix effects were generally less potent than mode effects. In addition, after mode and patient-mix adjustments were made, non-response effects were found to be negligible. A report on patient-mix and non-response adjustments is being finalized and will be made available soon on www.hcahpsonline.org. The algorithm for the patient-mix adjustment can be found in Appendix B.


CMS, with assistance from HSAG and its sub-contractor, the National Committee on Quality Assurance (NCQA), has developed and conducted a series of training programs for self-administering hospitals and survey vendors participating in the HCAHPS survey, as well as others interested in this program. HCAHPS Introductory Training sessions were conducted at CMS headquarters in January 2006, as well as by webinar in January and April 2006, and January 2007. In addition, HCAHPS Update Training was conducted by webinar in May 2007. We anticipate that HCAHPS Introductory training, which is necessary for self-administering hospitals and survey vendors that wish to join HCAHPS, will be conducted on an annual basis, while HCAHPS Update Training, which provides information on important changes to the HCAHPS program for all participating self-administering hospitals and survey vendors, will be offered as needed.


In addition to attending HCAHPS Introductory Training, self-administering hospitals and survey vendors are required to conduct a “dry run” of the HCAHPS survey before they can participate. The purpose of the dry run is for self-administering hospitals and survey vendors to gain experience in every phase of the survey process and its protocols, including data submission to the HCAHPS data warehouse. Dry runs have been conducted in Spring 2006 and March 2007, and the next dry run is scheduled for September 2007. Using the official survey instrument and the approved modes of implementation and data collection protocols, self-administering hospitals and survey vendors collect HCAHPS data for eligible patients and submit it to CMS via QualityNet Exchange. In addition to providing real experience, participation in the dry run produces HCAHPS data that the hospital can use for self-assessment. Data collected for the dry run is not publicly reported.


The dry runs have provided additional benefits to CMS by testing its HCAHPS-related systems and checking the usability of the HCAHPS protocols. Based upon hospitals' experience in the first dry run, CMS made several improvements to the data coding and submission processes, and decided to exclude several small categories of patients from eligibility for logistical reasons. These patient categories are: patients who were prisoners, who were discharged to hospice care, who had a foreign home address, or who upon admission had requested that the hospital not survey them.


Voluntary collection of HCAHPS data for public reporting began in October 2006. The first public reporting of HCAHPS results, which will encompass eligible discharges from October 2006 through June 2007, is slated for March 2008. HCAHPS results will be posted on the Hospital Compare website, found at www.hospitalcompare.hhs.gov, or through a link on www.medicare.gov. CMS is currently conducting consumer testing of the website display of HCAHPS results, as well as gathering input from key stakeholder groups. The website displays will be finalized in Fall 2007. The next opportunity for hospitals to join HCAHPS will be January 2008.


Beginning in July 2007 there will be an additional incentive for eligible hospitals to participate in HCAHPS. Hospitals subject to IPPS payment provisions ("subsection (d) hospitals") must collect and submit HCAHPS data in order to receive their full IPPS annual payment update (APU) for fiscal year 2008. IPPS hospitals that fail to report the required quality measures, which include the HCAHPS survey, may receive an APU that is reduced by 2.0 percentage points. Hospitals must submit a Pledge form in Summer 2007 stating their intention to participate from July 2007 forward. Non-IPPS hospitals, such as Critical Access Hospitals, can voluntarily participate in HCAHPS.


HCAHPS includes two types of results. The first is global ratings, which measure respondents’ assessment of their hospital using a scale from 0 to 10, and a variable measuring whether the respondents would recommend the hospital to a family member or friend. The second is composites, which combine results for closely related items from the same domain. Table 2 lists the questions for each of the two global ratings and seven composites used to report results from HCAHPS.


We also looked at the extent to which each domain contributes to measurement in priority areas established by the National Quality Forum (NQF), an independent, expert body on quality measurement. The HCAHPS domains "communication with doctors," "communication with nurses," and "communication about medications" will contribute to the NQF's priority on improving care coordination and communication. The HCAHPS "pain control" domain will contribute to the NQF's pain management priority, while the HCAHPS "discharge information" domain will contribute to the priority on improving self-management and health literacy.



Table 2. HOSPITAL CAHPS SURVEY Global Ratings and Reporting Composites

Global Ratings | Response Format

Overall Rating of Hospital
Q21 | Using any number from 0 to 10, where 0 is the worst hospital possible and 10 is the best hospital possible, what number would you use to rate this hospital? | 0-10 scale

Recommendation
Q22 | Would you recommend this hospital to your friends and family? | definitely no, probably no, probably yes, definitely yes

Composites and Items | Response Format

Communication with Nurses
Q1 | During this hospital stay, how often did nurses treat you with courtesy and respect? | never, sometimes, usually, always
Q2 | During this hospital stay, how often did nurses listen carefully to you? | never, sometimes, usually, always
Q3 | During this hospital stay, how often did nurses explain things in a way you could understand? | never, sometimes, usually, always

Communication with Doctors
Q5 | During this hospital stay, how often did doctors treat you with courtesy and respect? | never, sometimes, usually, always
Q6 | During this hospital stay, how often did doctors listen carefully to you? | never, sometimes, usually, always
Q7 | During this hospital stay, how often did doctors explain things in a way you could understand? | never, sometimes, usually, always

Physical Environment (cleanliness and quiet)
Q8 | During this hospital stay, how often were your room and bathroom kept clean? | never, sometimes, usually, always
Q9 | During this hospital stay, how often was the area around your room quiet at night? | never, sometimes, usually, always

Responsiveness of Hospital Staff
Q4 | During this hospital stay, after you pressed the call button, how often did you get help as soon as you wanted it? | never, sometimes, usually, always
Q11 | How often did you get help in getting to the bathroom or in using a bedpan as soon as you wanted? | never, sometimes, usually, always

Pain Control
Q13 | During this hospital stay, how often was your pain well controlled? | never, sometimes, usually, always
Q14 | During this hospital stay, how often did hospital staff do everything they could to help you with your pain? | never, sometimes, usually, always

Communication about Medications
Q16 | Before giving you any new medicine, how often did hospital staff tell you what the medicine was for? | never, sometimes, usually, always
Q17 | Before giving you any new medicine, how often did hospital staff describe possible side effects in a way you could understand? | never, sometimes, usually, always

Discharge Information
Q19 | During this hospital stay, did hospital staff talk with you about whether you would have the help you needed when you left the hospital? | yes, no
Q20 | During your hospital stay, did you get information in writing about what symptoms or health problems to look out for after you left the hospital? | yes, no






In addition to the questions that are used to form the HCAHPS measures of care from the patient's perspective, there are also four screener items (not listed in Table 2). These items are included to ensure that only appropriate respondents answer the questions that follow. For example, there is a screener item that asks whether, during the hospital stay, you were given any medicine that you had not taken before. Only respondents who answer "yes" are directed to answer the two following questions about how often hospital staff explained what the medicine was for and its possible side effects. We considered dropping the screener questions; however, CAHPS testing has shown that without screeners, respondents are more likely to give an inappropriate response than to check off "not applicable." It was also thought that this potential error would be further exacerbated by having different modes of administration.


The psychometric performance of the domain-level composites with respect to reliability and construct validity is summarized in Table 3. The basic unit of reporting for HCAHPS measures is the hospital. Thus, we focus on the psychometric performance of the composites at the hospital level. Hospital-level reliability reflects the extent to which variation in scores on a composite reflects variation between hospitals, as opposed to random variation in patient responses within hospitals. We want measures whose variation reflects differences in performance between hospitals. The hospital-level reliability reported in Table 3 is calculated as [1 – (1/F)], where F is the F-statistic for testing for differences among hospitals. The numerator of the F-statistic summarizes the amount of variation among the means for different hospitals on the measure in question. The denominator summarizes the amount of random variation expected in these means due to sampling of patients. If there were no real differences among hospitals, such that all of the differences were due to random variation in the reports of the patients who happened to answer the survey, the hospital-level reliability would be 0.0. The greater the real differences among hospitals, relative to random variation, the larger the hospital-level reliability is expected to be (up to a maximum of 1.0). Note that as we increase sample sizes, the measures become more precise, so the amount of random variation becomes smaller and the hospital-level reliability becomes larger.
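A minimal sketch of this calculation follows, assuming patient-level composite scores grouped by hospital. The data are simulated and the function is illustrative; it is not the official HCAHPS analysis code.

import numpy as np

def hospital_level_reliability(scores_by_hospital):
    # Hospital-level reliability computed as 1 - 1/F from a one-way ANOVA,
    # where F compares between-hospital to within-hospital variation.
    # scores_by_hospital is a list of 1-D arrays of patient-level scores,
    # one array per hospital.
    k = len(scores_by_hospital)                       # number of hospitals
    n_total = sum(len(s) for s in scores_by_hospital)
    grand_mean = np.mean(np.concatenate(scores_by_hospital))

    # Between-hospital and within-hospital sums of squares.
    ss_between = sum(len(s) * (np.mean(s) - grand_mean) ** 2 for s in scores_by_hospital)
    ss_within = sum(((s - np.mean(s)) ** 2).sum() for s in scores_by_hospital)

    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    f_stat = ms_between / ms_within
    return 1.0 - 1.0 / f_stat

# Toy example: three hospitals with invented patient-level composite scores.
rng = np.random.default_rng(0)
toy = [rng.normal(loc=m, scale=1.0, size=150) for m in (3.2, 3.5, 3.8)]
print(round(hospital_level_reliability(toy), 2))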


There is no fixed level that reliability needs to reach for a measure to be useful, but 0.7 is a commonly used rule-of-thumb. The hospital-level reliabilities of communication with doctors (0.76), communication with nurses (0.89), responsiveness of hospital staff (0.81), cleanliness and quiet of the physical environment (0.77), and discharge information (0.75) are all above the rule-of-thumb. Pain control (0.62) and communication about medicines (0.68) are a little lower.


Construct validity represents the extent to which a measure relates to other measures in the way expected. If the proposed domains are important factors in quality for consumer choice, we would expect that hospitals with high scores on the composites would also have high scores on patient willingness to recommend the hospital and the overall rating. That is, the composites should be positively correlated at the hospital level with the willingness to recommend the hospital and the overall rating of the hospital. Again, there is no fixed level that these correlations need to reach for the composite to be useful, but 0.4 is a reasonable rule-of-thumb. The hospital-level correlations between the composites and willingness to recommend the hospital were all above this level. The correlation was 0.54 for communication with doctors, 0.76 for communication with nurses, 0.70 for responsiveness of hospital staff, 0.68 for cleanliness and quiet of the physical environment, 0.72 for pain control, 0.73 for communication about medicines, and 0.53 for discharge information. There was a similar pattern for the correlations of the composites with the overall hospital rating, with values ranging from 0.81 for communication with nurses to 0.57 for discharge information.

TABLE 3
HOSPITAL CAHPS SURVEY DOMAIN-LEVEL COMPOSITES, INDICATORS OF PSYCHOMETRIC PERFORMANCE

Domain-level composite | Hospital-level reliability | Hospital-level correlation with willingness to recommend | Hospital-level correlation with overall rating
Communication with doctors | 0.76 | 0.54 | 0.59
Communication with nurses | 0.89 | 0.76 | 0.81
Responsiveness of hospital staff | 0.81 | 0.70 | 0.75
Cleanliness and quiet of environ. | 0.77 | 0.68 | 0.75
Pain control | 0.62 | 0.72 | 0.76
Communication about meds. | 0.68 | 0.73 | 0.65
Discharge information | 0.75 | 0.53 | 0.57

Note: Data from 3-state pilot study (130 hospitals, 19,683 discharges).









It should be noted, though, that the composites themselves are correlated, some highly so (see Table 4). One question is the extent to which each domain-level composite has an independent effect on willingness to recommend the hospital and overall rating of the hospital. To examine this, AHRQ ran regressions of willingness to recommend the hospital and overall rating of the hospital with the seven composites.


TABLE 4
HOSPITAL-LEVEL CORRELATIONS AMONG DOMAIN COMPOSITES

Domain | Commun. with doctors | Commun. with nurses | Respons. of staff | Cleanliness and quiet of envir. | Pain control | Commun. about meds | Discharge information
Communication with doctors | 1.00
Communication with nurses | 0.55 | 1.00
Responsiveness of hospital staff | 0.62 | 0.91 | 1.00
Cleanliness and quiet of envir. | 0.39 | 0.64 | 0.64 | 1.00
Pain control | 0.82 | 0.82 | 0.85 | 0.59 | 1.00
Communication about meds. | 0.68 | 0.79 | 0.83 | 0.64 | 0.81 | 1.00
Discharge information | 0.80 | 0.65 | 0.73 | 0.39 | 0.76 | 0.69 | 1.00

Data from 3-state pilot study (130 hospitals, 19,683 discharges).








The results of these regressions are shown in Table 5. Because of the high collinearity among the composites, only communication with nurses and pain control had statistically significant effects in the equation for willingness to recommend the hospital. Communication with nurses, pain control, and communication about medicines had statistically significant effects in the equation for the overall rating of the hospital.



TABLE 5
HOSPITAL-LEVEL REGRESSIONS USING DOMAIN COMPOSITES

Domain-level composite | Willingness to recommend hospital (parameter estimate / standard error / p-value) | Overall rating of hospital (parameter estimate / standard error / p-value)
Communication with doctors | -0.296 / 0.160 / 0.067 | -0.516 / 0.315 / 0.103
Communication with nurses | 0.627 / 0.197 / 0.001 | 1.559 / 0.389 / 0.000
Responsiveness of hospital staff | 0.046 / 0.120 / 0.706 | 0.138 / 0.237 / 0.562
Cleanliness and quiet of envir. | 0.073 / 0.103 / 0.480 | 0.312 / 0.204 / 0.128
Pain control | 0.536 / 0.160 / 0.001 | 1.112 / 0.316 / 0.001
Communication about meds. | 0.114 / 0.086 / 0.189 | 0.403 / 0.170 / 0.019
Discharge information | 0.126 / 0.213 / 0.555 | 0.485 / 0.420 / 0.251

R-square=0.669 for willingness to recommend hospital.
R-square=0.787 for overall rating of hospital.
Data from 3-state pilot study (130 hospitals, 19,683 discharges).
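For readers who want to see the mechanics behind a table like this, here is a minimal sketch of a hospital-level ordinary least squares regression of a global item on the seven composite means. The data are simulated and the coefficients are arbitrary; this is not the pilot-study analysis itself.

import numpy as np

# Illustrative only: hospital-level regression in the spirit of Table 5.
rng = np.random.default_rng(0)
n_hospitals = 130
composite_means = rng.normal(loc=3.5, scale=0.2, size=(n_hospitals, 7))  # seven composites
true_beta = np.array([0.6, 0.5, 0.1, 0.1, 0.5, 0.1, 0.1])                # invented effects
recommend = composite_means @ true_beta + rng.normal(scale=0.1, size=n_hospitals)

# Ordinary least squares with an intercept.
X = np.column_stack([np.ones(n_hospitals), composite_means])
beta_hat, *_ = np.linalg.lstsq(X, recommend, rcond=None)

fitted = X @ beta_hat
ss_res = ((recommend - fitted) ** 2).sum()
ss_tot = ((recommend - recommend.mean()) ** 2).sum()
print("estimated coefficients:", np.round(beta_hat[1:], 3))
print("R-square:", round(1 - ss_res / ss_tot, 3))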






To further examine this issue, AHRQ conducted a hospital-level factor analysis just with items from the reduced version of the questionnaire. This analysis extracted three factors. A factor that might be called “nursing services” was formed that combined the communication with nurses, responsiveness of hospital staff, and communication about medicines composites. A factor that might be labeled “physician care” was formed that combined the communication with doctors, pain control, and discharge information composites. The third factor was the same as the current cleanliness and quiet of the hospital environment composite. AHRQ then ran regressions of willingness to recommend the hospital and overall rating of the hospital with these three factors. The results are presented in Table 6. The nursing services and physician care factors were strong predictors in both the equation for willingness to recommend the hospital and the equation for overall rating of the hospital. The effect of cleanliness and quiet of the environment was statistically significant in the overall rating equation.


TABLE 6
HOSPITAL-LEVEL REGRESSIONS USING COMBINED FACTORS

Factor | Willingness to recommend hospital (parameter estimate / standard error / p-value) | Overall rating of hospital (parameter estimate / standard error / p-value)
Nursing services a/ | 0.574 / 0.146 / 0.000 | 1.613 / 0.291 / 0.000
Physician care b/ | 0.709 / 0.247 / 0.005 | 1.761 / 0.492 / 0.001
Cleanliness and quiet of environment | 0.142 / 0.103 / 0.172 | 0.426 / 0.205 / 0.040

R-square=0.619 for willingness to recommend hospital.
R-square=0.751 for overall rating of hospital.
Data from 3-state pilot study (130 hospitals, 19,683 discharges).
a/ Combines communication with nurses, responsiveness of hospital staff, and communication about medicines.
b/ Combines communication with doctors, pain control, discharge information.





Establishing the domains that are important for public reporting is based on the psychometric characteristics of the measures and the utility of the information from the perspective of consumers. The input we have received suggests that the seven composites are valuable. Thus, the original composite structure was maintained to best support consumer choice.


Another set of items is included on the survey for patient-mix adjustment and other analytic purposes. The goal of HCAHPS is to collect information from patients using the HCAHPS survey and to present the information based on those surveys to consumers, providers and hospitals. One of the methodological issues associated with making comparisons between hospitals is the need to adjust appropriately for patient-mix differences. Patient-mix refers to patient characteristics that are not under the control of the hospital that may affect measures of patient experiences, such as demographic characteristics and health status. The basic goal of adjusting for patient-mix is to estimate how different hospitals would be rated if they all provided care to comparable groups of patients.


As discussed earlier, there will be an adjustment for the hospital reports to control for patient characteristics that affect ratings and are differentially distributed across hospitals. Most of the patient-mix items are included in the “About You” section of the instrument, while others are taken from administrative records. Based on the mode experiment, and consistent with previous studies of patient-mix adjustment in CAHPS and in previous hospital patient surveys, we will be using the following variables in the patient-mix adjustment model:

  • Self-reported general health status (specified as a linear variable)

  • Education (specified as a linear variable)

  • Type of service (medical, surgical, or maternity care)

  • Age (specified as a categorical variable)

  • Admission through emergency room

  • Lag time between discharge and survey

  • Age by service line interaction

  • Language other than English spoken at home


Once the data are adjusted for patient-mix, there will be a fixed adjustment for each of the reported measures for mode of administration (discussed in detail below). The patient-mix adjustment will use a regression methodology also referred to as covariance adjustment.
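The sketch below illustrates the general idea of a covariance (regression) adjustment of this kind. The variable coding, the ordinary least squares fit, and the simulated data are assumptions for illustration; the actual HCAHPS patient-mix adjustment algorithm is the one described in Appendix B and on www.hcahpsonline.org.

import numpy as np

# Hypothetical patient-level records: the outcome is an HCAHPS measure (e.g., a
# 0-10 rating) and the covariates are patient-mix variables such as self-reported
# health status, education, and service line. All coding here is assumed.
rng = np.random.default_rng(2)
n = 1_000
health = rng.integers(1, 6, size=n)          # 1 (excellent) .. 5 (poor), linear
education = rng.integers(1, 7, size=n)       # 1 .. 6, linear
maternity = rng.integers(0, 2, size=n)       # indicator for maternity service line
hospital = rng.integers(0, 4, size=n)        # four hypothetical hospitals
rating = 9 - 0.3 * health + 0.1 * education + 0.2 * maternity + rng.normal(size=n)

# Fit the patient-mix model (covariance adjustment) by ordinary least squares.
X = np.column_stack([np.ones(n), health, education, maternity])
beta, *_ = np.linalg.lstsq(X, rating, rcond=None)

# Adjusted hospital score: observed hospital mean minus the effect of that
# hospital's patient mix relative to the overall patient mix.
overall_mix = X.mean(axis=0)
for h in range(4):
    mask = hospital == h
    expected_from_mix = X[mask].mean(axis=0) @ beta
    expected_overall = overall_mix @ beta
    adjusted = rating[mask].mean() - (expected_from_mix - expected_overall)
    print(f"hospital {h}: raw {rating[mask].mean():.2f}, adjusted {adjusted:.2f}")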


On the survey, there are two additional questions to capture the race and ethnicity of the respondent. These are not included in the patient-mix adjustment model but are included as analytic variables to support the congressionally mandated "National Healthcare Disparities Report" and "National Healthcare Quality Report." These reports will provide annual, national-level breakdowns of HCAHPS scores by race and ethnicity. Many hospitals collect information on race and ethnicity through their administrative systems, but the coding is not standard. Thus, it was determined that administrative data are not adequate to support the analyses needed for the reports and that the items should be included in the questionnaire.


A 1.2 Survey Approach

The HCAHPS survey will be administered in English and Spanish to a random sample of adult patients, with at least one overnight stay, discharged from an acute care hospital. Psychiatric and pediatric patients are excluded from the survey, as well as patients who were prisoners, who were discharged to hospice care, who had a foreign home address, or who at admission requested the hospital not to survey them (“no publicity” patients). In addition, self-administering hospitals and survey vendors have a new option of offering the mail version of the HCAHPS survey in Chinese, though using this instrument requires permission from CMS.


CMS will require that, following training, approved hospitals or vendors administer the HCAHPS survey either as (a) a stand-alone survey or (b) integrated with the hospital's existing survey. If the survey is integrated with an existing survey, the two HCAHPS global ratings and the items that constitute the seven domains will be placed at the beginning of the questionnaire. Participating hospitals may place items from their current survey after the HCAHPS core items (questions 1-22). HCAHPS demographic items (questions 23-27) may be placed anywhere in the questionnaire after the core items. We suggest that the hospital/vendor use transitional phrasing such as the following to move from the HCAHPS items to the hospital-specific ones:

"Now we would like to gather some additional detail on topics we have asked you about before. These items use a somewhat different way of asking for your response since they are getting at a little different way of thinking about the topics."



Flexibility in the mode of administering the survey will be permitted. The hospitals/vendors may use any of the following modes: telephone only, mail only, a mixed methodology of mail with telephone follow-up, or active interactive voice response (IVR). All modes of administration will require following a standardized protocol. Quality assurance guidelines have been developed for each mode of survey administration detailing issues related to protocol, handling of the questionnaires and other materials, training, interviewing systems, attempts, monitoring and oversight.


Modes of Survey Administration


Mail Only

For the mail only option, the hospital/vendor is required to send the HCAHPS questionnaire, alone or combined with hospital-specific questions, along with a cover letter, between 48 hours and 6 weeks following discharge. CMS provides sample cover letters to hospitals/vendors in its training program for HCAHPS. The hospitals/vendors may tailor their letters, but the letters must explain the purpose of the survey and state that participation is voluntary and will not affect the patient's health care benefits.


The hospital/vendor must send a second questionnaire with a reminder/thank you letter to those not responding approximately 21 days after the first mailing. Data collection would be closed out for a particular respondent within 21 days following the mailing of the second questionnaire.


Telephone Only

For the telephone only option, the hospital/vendor is required to begin data collection between 48 hours and six weeks following discharge. The hospital/vendor must attempt to contact respondents up to 5 times unless the respondent explicitly refuses to complete the survey. These attempts must be made on different days of the week and different times of the day and in different weeks to ensure that as many respondents are reached as feasible. Data collection is closed out for a particular respondent 42 days following the first telephone attempt.


For the HCAHPS portion of the survey, CMS is providing scripts to follow for the telephone interviewing in both English and Spanish. The interviewers conducting the survey must be trained before beginning interviewing. In its training program for HCAHPS, CMS provides guidance on how to train interviewers to conduct HCAHPS. The training program emphasizes that interviewers read questions as worded, use non-directive probes, maintain a neutral and professional relationship with the respondent, and record only the answers that the respondents themselves choose.


Mixed Mode

A third option is a combination of mail and telephone. In this mixed mode of administration, there is one wave of mailing (cover letter and questionnaire) and up to five telephone call-back attempts for non-respondents. The first survey is sent out between 48 hours and 6 weeks following discharge. Telephone follow-up is initiated for all non-respondents approximately 21 days after the initial questionnaire mailing. The telephone attempts must be made on different days of the week and different times of the day, and in different weeks to ensure that as many respondents are reached as feasible. Telephone interviewing must end 6 weeks after the first survey mailing. Similar to the telephone only mode, CMS provides telephone scripts for the hospitals/vendors to follow.



Active IVR

For active IVR, hospitals/vendors must initiate data collection by phone between 48 hours and six weeks following discharge. A live interviewer asks the respondent if she/he is willing to complete the survey using the IVR system. Through the IVR system, respondents complete the survey using the touch-tone keypad on their phone. The hospital/vendor is required to provide an option for the respondent to opt out of the system and return to a live interviewer.


Similar to the telephone mode, the hospital/vendor must call each respondent up to 5 times unless the respondent refuses to complete the survey. These attempts must be made on different days of the week and different times of the day, and in different weeks to ensure that as many respondents are reached as feasible. Data collection must be closed out for a particular respondent 42 days following the first telephone attempt.
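For convenience, the timing rules for the four modes described above can be collected in one place. The structure below simply restates those windows; it is not an official CMS configuration or data format.

# Summary of the data-collection windows described above (relative to discharge
# or to the first contact attempt). This is a plain restatement for reference only.
protocols = {
    "mail_only": {
        "first_mailing": "48 hours to 6 weeks after discharge",
        "second_mailing": "about 21 days after the first mailing (non-respondents)",
        "close_out": "21 days after the second mailing",
    },
    "telephone_only": {
        "first_attempt": "48 hours to 6 weeks after discharge",
        "max_attempts": 5,
        "close_out": "42 days after the first telephone attempt",
    },
    "mixed_mail_phone": {
        "mailing": "48 hours to 6 weeks after discharge",
        "phone_follow_up": "about 21 days after the mailing, up to 5 attempts",
        "close_out": "6 weeks after the first survey mailing",
    },
    "active_ivr": {
        "first_attempt": "48 hours to 6 weeks after discharge",
        "max_attempts": 5,
        "close_out": "42 days after the first telephone attempt",
    },
}
print(protocols["mail_only"]["close_out"])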


Sampling

A description of sampling for HCAHPS follows, including the basic structure of sampling, population and sampling frame, and sampling approach.


Basic structure of sampling


We received public input regarding sampling. The majority of respondents preferred to sample discharges on a more continuous basis (i.e., a monthly basis) and to cumulate these samples to create rolling estimates based on 12 months of data. We chose to pursue the more continuous sampling approach for the following reasons:


  • It is more easily integrated with many hospitals’ existing survey processes used for internal improvement.


  • Improvements in hospital care can be more quickly reflected in hospital scores (e.g., 12-month estimates could be updated on a quarterly or semi-annual basis).


  • Hospital scores are less susceptible to unique events that could affect hospital performance at a specific point in time (e.g., a temporary shortage of nursing staff that could adversely affect the hospital’s score if the survey was done only once during the year at the same time as the shortage).


  • It is less susceptible to gaming (e.g., hospitals being on their best behavior just around the survey).


  • By sampling and collecting data on a monthly basis, it is not necessary to reach so far back in time in order to obtain a sufficient number of discharges to meet the sample size requirements.


For these reasons, the basic structure of sampling for HCAHPS entails drawing a sample of relevant discharges on a monthly basis. Data will be collected from patients in each monthly sample and will be cumulated to create a rolling 12-month data file for each hospital. Hospital-level scores for the HCAHPS measures will be produced using 12 months of data. After the initial 12 months of data collection, the hospital-level scores will be updated on a quarterly basis utilizing the most recent 12 months of data.
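As an illustration of the rolling 12-month structure, the sketch below lists the monthly samples that would feed a data file closing with a given quarter; the example dates are generic and do not represent a CMS reporting schedule.

from datetime import date

def rolling_window(quarter_end: date, months: int = 12):
    # Return the (year, month) pairs of monthly samples included in a rolling
    # 12-month file ending with the month of the given quarter-end date.
    y, m = quarter_end.year, quarter_end.month
    window = []
    for _ in range(months):
        window.append((y, m))
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    return list(reversed(window))

# Example: a file closing with June discharges covers July of the prior year
# through June of the current year; the next quarterly update shifts by 3 months.
print(rolling_window(date(2008, 6, 30)))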


Population and sampling frame


HCAHPS is designed to collect data on care from the patient’s perspective for general acute care hospitals. Pediatric, psychiatric, and other specialty hospitals are not included (additional/different aspects of care need to be examined for these specialty settings). Within general acute care hospitals, HCAHPS scores are designed to reflect the care received by patients of all payers (not just Medicare patients). Specifically, this includes the population of non-psychiatric and non-pediatric patients who had an overnight stay in the hospital and were alive when discharged. Patients who did not have an overnight stay are excluded because we don’t want to include patients who had very limited inpatient interaction in the hospital (e.g., patients who were admitted for a short period for purely observational purposes; patients getting only outpatient care are not included in HCAHPS). Patients who died in the hospital are excluded because HCAHPS is designed to capture information from the perspective of the patient himself or herself, thus proxy respondents are not permitted.


Following the first dry run of HCAHPS, for logistical reasons we decided to exclude several additional categories of patients from eligibility: those who were prisoners, who were discharged to hospice care, who had a foreign home address, or who at admission had requested the hospital not to survey them (“no publicity” patients).


CMS designed the HCAHPS survey with the intention of capturing the views of the broadest sample of patients discharged from short-term, acute care hospitals. Therefore, categorical exclusions of patients from the HCAHPS survey are few and are based on CMS policy decisions. Patients will be excluded only when the survey does not properly apply to them (pediatric patients below age 18, and psychiatric patients), or when they have died in the hospital or prior to being surveyed. CMS may revisit the issue of exclusion categories as it gains experience administering HCAHPS. In addition, future versions of HCAHPS may be developed for some patients who are currently excluded, e.g., pediatric patients.


While a wide spectrum of patients will be eligible for participation in HCAHPS, personal identities will not be asked for or revealed. Eligible discharged patients will be randomly surveyed by hospitals or their designated vendors, and we have designed the data files so that all protected health information (PHI) will be de-identified (see 45 CFR §164.514, de-identification of PHI) before transmission to CMS. Thus, a patient's personal identity will not be transmitted to, revealed to, or used by CMS. It is our intent that hospitals will submit a thoroughly de-identified data set through the QualityNet Exchange. The data will be analyzed by the Health Services Advisory Group, a quality improvement organization. Hospital-level data will then be transmitted to CMS for public reporting.


The monthly sampling frame that defines the population for a given hospital includes all discharges between the first and last days of the month, with the exclusions noted above.

CMS will periodically review the patient sampling process employed by participating hospitals to assure that patients are being properly included or excluded, and that patients are being surveyed at random. Some states have further restrictions on patients who may be contacted. Hospitals/vendors should exclude other patients as required by law or regulation in the state in which they operate.


Sampling approach

Hospitals and survey vendors have several options for the monthly sampling of eligible discharges, including: a simple random sample (which, as a special case, includes a census of all eligible discharges); a proportionate stratified random sample (PSRS); or a disproportionate stratified random sample (DSRS). The simple random sample is considered normative. Hospitals/survey vendors must receive an exception in order to use a disproportionate stratified random sample, and must provide the name, number of eligible discharges, and number of sampled discharges of each stratum.


In the simple random sample approach, the hospital/vendor will draw a simple random sample each month from the sampling frame of eligible discharges. Sampling can be done at one time after the end of the month or continuously throughout the month, as long as a simple random sample is generated for the month (the hospital/vendor can choose what works best with their current survey activities for internal improvement).
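
For illustration only, the following is a minimal sketch of one way a monthly simple random sample could be drawn, assuming a hypothetical Python list of de-identified, already-screened discharge record IDs and a hypothetical monthly target n1; none of the names or values are part of the official HCAHPS specification.

    import random

    def draw_monthly_sample(eligible_discharges, n1, seed=None):
        # Simple random sample of n1 records from one month's frame of
        # eligible discharges; if the frame is smaller than the target,
        # take a census of all eligible discharges.
        rng = random.Random(seed)
        if len(eligible_discharges) <= n1:
            return list(eligible_discharges)
        return rng.sample(eligible_discharges, n1)

    # Hypothetical frame of 900 eligible June discharges, target of 66 per month.
    june_frame = ["rec%05d" % i for i in range(1, 901)]
    monthly_sample = draw_monthly_sample(june_frame, n1=66, seed=2007)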


As noted above, monthly data will be cumulated to produce a rolling 12-month data file for each hospital that will be used to produce hospital-level scores for the HCAHPS measure. The target for the statistical precision of these scores is based on a reliability criterion; higher reliability means a higher “signal-to-noise” ratio in the data. The target is that the reliability for the global ratings and most composites be 0.8 or more. Based on this, it will be necessary for each hospital to obtain 300 completed HCAHPS questionnaires over a 12-month period. Small hospitals that are unable to reach the target of 300 completes in a given 12-month period should sample all eligible discharges and attempt to obtain as many completes as possible.


In order to calculate the number of discharges that need to be sampled each month to reach this target, it is necessary to take into account the proportion of sampled patients expected to complete the survey (not all sampled patients who are contacted to complete the survey will actually complete it; the target is 300 completes). This is a function of the proportion of sampled patients who turn out to be ineligible for the survey and the survey response rate among eligible respondents. The calculation of the sample size needed each month can be summarized as follows:



Step 1 – Identify the number of completes needed over 12 months

C = number of completes needed = 300


Step 2 – Estimate the proportion of sampled patients expected to complete the survey

Let:

I = expected proportion of ineligible sampled patients

R = expected survey response rate among eligible respondents

P = proportion of sampled patients expected to complete the survey

= (1 – I) x R


Step 3 – Calculate the number of discharges to sample


N12 = sample size needed over 12 months = C / P

N1 = Sample size needed each month = N12 / 12
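
As a hypothetical illustration of Steps 1-3, the following sketch computes the needed sample sizes under assumed values of I and R; the 0.05 ineligibility proportion is purely illustrative, and 0.40 is the protocol's minimum target response rate.

    import math

    def monthly_sample_size(completes_needed=300,   # C
                            ineligible_rate=0.05,   # I (hypothetical value)
                            response_rate=0.40):    # R (protocol's minimum target)
        p = (1 - ineligible_rate) * response_rate   # P = (1 - I) x R
        n12 = completes_needed / p                  # N12 = C / P
        n1 = math.ceil(n12 / 12)                    # N1 = N12 / 12, rounded up
        return n12, n1

    n12, n1 = monthly_sample_size()
    # With I = 0.05 and R = 0.40: P = 0.38, N12 is about 790, and N1 is 66
    # discharges to sample each month.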



Some small hospitals will not be able to reach the target of 300 completes in a given 12-month period. In such cases, the hospital should sample all discharges and attempt to obtain as many completes as possible. All hospital results will be publicly reported. However, the lower precision of the scores derived from fewer than 300 completed surveys will be noted in the public reporting. CMS will also publicly report the response rate achieved by each hospital participating in HCAHPS and, roughly, its number of completed surveys (“less than 100”, “100 to 299”, and “300 or more”).


B 1.0 Need and Legal Basis


HCAHPS is part of the Hospital Quality Alliance, a private/public partnership that includes the American Hospital Association, the Federation of American Hospitals, the Association of American Medical Colleges, the Joint Commission on Accreditation of Healthcare Organizations, the National Quality Forum, AARP, and CMS/AHRQ. Working together, the group wants to 1) provide useful and valid information about hospital quality to the public; 2) provide hospitals a sense of predictability about public reporting expectations; 3) begin to standardize data and data collection mechanisms; and 4) foster hospital quality improvement. Clinical information has been available on the Internet at www.cms.gov since the summer of 2003 and was added to www.hospitalcompare.hhs.gov in April 2005. HCAHPS results will be added to this website beginning in March 2008.


In addition, beginning with fiscal year 2008 participation in HCAHPS will become part of the Reporting Hospital Quality Data Annual Payment Update (RHQDAPU) program. Hospitals that are subject to the inpatient prospective payment system (IPPS) provisions (RHQDAPU-eligible "subsection (d) hospitals") must meet the new reporting requirements in order to receive their full IPPS annual payment update (APU). Hospitals that do not participate in the RHQDAPU initiative, which includes the new HCAHPS reporting requirements, could receive a reduction of 2.0 percent in their Medicare Annual Payment Update for fiscal year 2008. Non-IPPS hospitals can voluntarily participate in HCAHPS.


B 2.0 Purpose and Use of Information

Three broad goals have shaped HCAHPS. First, the survey is designed to produce comparable data on the patient’s perspective on care that allows objective and meaningful comparisons between hospitals on domains that are important to consumers. Second, public reporting of the survey results is designed to create incentives for hospitals to improve their quality of care. Third, public reporting will serve to enhance public accountability in health care by increasing the transparency of the quality of hospital care provided in return for the public investment. With these goals in mind, the HCAHPS project has taken substantial steps to assure that the survey will be credible, useful, and practical. This methodology and the information it generates will be made available to the public.


There are many excellent patient surveys currently in use by hospitals. However, most of them are proprietary and are not constructed in a way that would allow patient assessment of hospital care across the country. That is, the instruments are not standardized. With a standardized instrument consumers will be able to make “apples to apples” comparisons among hospitals, allow hospitals and hospital chains to self compare, and increase the public accountability of hospitals.


A standardized instrument, developed under the CAHPS umbrella, has produced a reliable and valid instrument that any organization can use to obtain patient data about hospital experiences. This tool was adopted by the Hospital Quality Alliance. Data collection began in October 2006, and the first public reporting of participating hospitals’ HCAHPS results will occur in March 2008.


B 3.0 Use of Improved Information Technology

Since CMS is committed to allowing flexibility in the administration of HCAHPS, one of the four modes of administering the survey is active IVR. IVR, short for interactive voice response, is a telephony technology in which a respondent uses a touch-tone telephone to interact with a database, either to retrieve information from it or to enter data into it. IVR technology does not require human interaction over the telephone, as the user's interaction with the database is predetermined by what the IVR system allows the user to access. For example, banks and credit card companies use IVR systems so that their customers can receive up-to-date account information instantly and easily without having to speak directly to a person. IVR technology is also used to gather information, as in the case of telephone surveys in which the user is prompted to answer questions by pushing the numbers on a touch-tone telephone. Patients selected for this mode will, however, be able to opt out of the interactive voice response system and return to a “live” interviewer.

B 4.0 Efforts to Identify Duplication

Many hospitals are already carrying out their own patient experience of care surveys. These diverse surveys do not allow for comparisons across hospitals. Making comparative performance information available to the public can help consumers make more informed choices when selecting a hospital and can create incentives for hospitals to improve the care they provide.

Adding own items

Vendors/hospitals will have the option to add their own questions to the HCAHPS core questionnaire. A hospital/vendor that plans to add its own questions must place them after the core HCAHPS items (Questions 1-22). The “About You” section can be placed after the core HCAHPS items or following the hospital-specific items. A hospital/vendor that adds its own questions should pay attention to the length of the questionnaire: the longer the questionnaire, the greater the burden on respondents.


Hospitals participating in HCAHPS will submit to CMS only the data collected for HCAHPS purposes. Hospitals will retain possession of all of the data they collect from the HCAHPS survey and may analyze it according to their own needs. The analysis that CMS performs is done for the purposes of public reporting.

To promote its wide and rapid adoption, HCAHPS has been carefully designed to fit within the framework of patient satisfaction surveying that hospitals currently employ. Still, CMS fully understands that participation in the HCAHPS initiative will require some effort and expense on the part of hospitals that volunteer to take part. CMS has provided, and will continue to provide, mandatory training for HCAHPS at no fee to fully inform hospitals and survey vendors about implementation, reporting, and other issues. In addition, hospitals will be obliged to fully trial HCAHPS by conducting a “dry run” (with no public reporting of results) before they can fully participate in HCAHPS.


CMS does not require that hospitals drop any items from their ongoing patient satisfaction surveys in order to participate in the HCAHPS initiative. The content of the HCAHPS survey has been kept to the minimum number of items necessary to fulfill the objective of providing valid and comparable information on topics of greatest importance to consumers. CMS does suggest that hospitals consider removing current items that essentially duplicate HCAHPS items. CMS makes no recommendations with respect to hospital items that are unrelated to HCAHPS domains, or to patient surveys targeted at those not eligible for HCAHPS.


Similarly, CMS does not require or recommend that participating hospitals abandon their customized surveys of specific patient groups. The recommended number of completed HCAHPS surveys (at least 300 per year from a random sample of eligible patients; 100 from smaller hospitals) can be accommodated within the surveying plans of most hospitals that conduct patient satisfaction surveys.


A hospital’s participation in HCAHPS does not mean that it can or should survey only enough of its HCAHPS-eligible patients to produce 300 completed surveys per year. At its discretion, a hospital may submit more than 300 completed HCAHPS surveys, sample patients not eligible for HCAHPS, stratify patients according to its own criteria, or over-sample certain types of patients. During training CMS presents detailed information on sampling for HCAHPS, including how the HCAHPS sample can be accommodated within the stratified sampling schemes employed by hospitals. Detailed information on sampling and other important aspects of administering the HCAHPS survey can be found in the HCAHPS Quality Assurance Guidelines, Version 2.0, which is available on our website, www.hcahpsonline.org. In addition, policy and protocol updates, training announcements, etc. are regularly posted on this website.


There are a couple of additional issues related to integrating surveys: mixing the HCAHPS response options with the differing response options on current vendor surveys, and revisiting domains that have already been covered earlier in the questionnaire. These issues can be handled by using transitional phrasing between the HCAHPS questions and the hospital-specific ones. An example of such phrasing is as follows:

“Now we would like to gather some additional detail on topics we have asked you about before. These items use a somewhat different way of asking for your response since they are getting at a little different way of thinking about the topics.”


B 5.0 Involvement of Small Entities

We realize that some small hospitals that wish to voluntarily participate in HCAHPS will not be able to reach the target of 300 completes in a given 12-month period. In such cases, the hospital should sample all discharges and attempt to obtain as many completes as possible. HCAHPS results from such hospitals will be publicly reported, but the lower stability of scores based on fewer than 300 completed surveys will be noted.


According to the Abt Associates cost and benefits report, the cost for a small hospital to have 100 complete surveys ranges from $1,000 - $1,500 if HCAHPS is implemented as a stand-alone survey. It will cost approximately $326 annually if it is integrated into an existing survey.

B 6.0 Consequences if Information Collected Less Frequently

We have spent a great deal of time considering how often HCAHPS data should be collected and have solicited feedback from the public on this very issue. We received substantial feedback on this issue from the June 27, 2003 Federal Register Notice. Two options for frequency of data collection were suggested: once during the year, and continuous. The majority of respondents to this notice suggested that continuous sampling would be easier for them to integrate into their current data collection processes. We have decided to require sampling of discharges on a more continuous basis (i.e., a monthly basis) and to cumulate these samples to create rolling estimates based on 12 months of data. We chose to pursue the continuous sampling approach for the following reasons:

  • It is more easily integrated with many existing survey processes used for internal improvement.


  • Improvements in hospital care can be more quickly reflected in hospital scores (e.g., 12-month estimates could be updated on a quarterly or semi-annual basis).


  • Hospital scores are less susceptible to unique events that could affect hospital performance at a specific point in time.


  • It is less susceptible to gaming (e.g., hospitals being on their best behavior just around the survey).


  • There is less variation in time between discharge and data collection.




B 7.0 Special Circumstances

There are no special circumstances with this information collection request.

B 8.0 Federal Register Notice/Outside Consultation

A 60-day Federal Register notice was published on July 13, 2007.

Throughout the HCAHPS development process, CMS has solicited and received a great deal of public input. As a result, the HCAHPS questionnaire and methodology have gone through several iterations. The first was a 66-item version that was tested in a three-state pilot study (this was developed for testing purposes; we never intended that the final version be this long). Prior to the start of the pilot test, a Federal Register notice was published in February 2003 soliciting input on the proposed pilot study. This notice produced nearly 150 comments. Based on results of the pilot study, the questionnaire was reduced to 32 items. CMS received additional feedback from a Federal Register Notice published in June 2003 that sought further comment on the survey and implementation issues while the initial pilot testing was underway. CMS received 110 responses to the notice from hospital associations, provider groups, consumers/purchasers, and hospital survey vendors.


A 32-item version of the HCAHPS instrument was published in the Federal Register in December 2003 for public comment; CMS received nearly 600 comments that focused on the following topics: sampling, response rate, implementation procedures, cost issues, the length of the instrument, exclusion categories, question wording, and reporting.


CMS, AHRQ, the CAHPS grantees and members of the Instrument, Analysis, and Cultural Comparability teams met via conference calls two to three times per week to review all comments from the three Federal Register Notices and to modify the survey instrument and implementation strategy based on the comments. The input we received conflicted. Survey vendors and many hospitals indicated that the 32-item version was too long, while consumer groups indicated that the full content was needed to support consumer choice. After the comments were reviewed, CMS and AHRQ held additional discussions with hospitals, vendors and consumers to discuss the comments received.


Using the comments received from the public and stakeholders, and the psychometric analysis of the data from the 3-state pilot study, CMS reduced the next version of the HCAHPS questionnaire to 25 items. The following questions were eliminated from the longer versions of the questionnaire: overall rating of the doctor; overall rating of the nurse; how often did doctors treat you with courtesy and respect; how often did nurses treat you with courtesy and respect; overall mental health status; and two items related to whether and how a proxy helped complete the survey. The item regarding how often hospital staff asked if you were allergic to any medicine was also eliminated from the survey. A question from the original 66-item survey was used to replace the allergy question. This newly re-introduced item asks, “Before giving you the medicine, how often did hospital staff tell you what the medicine was for?” (Question 14 on the 25-item survey). In response to public input, we have also eliminated the reference to doctors in the screener question that identifies patients who needed help getting to the bathroom or using a bedpan (Question 8 on the 25-item survey).


The questions on overall mental health status and the two proxy questions were dropped because their impact in the patient-mix adjustment was negligible. Questions about being shown courtesy and respect by doctors and nurses were pulled from the latest version because input received indicated that the two items on doctors or nurses “explaining things fully” and “listening carefully” were synonymous with showing courtesy and respect. Taking these questions out of these composites did not adversely impact the psychometric properties of the doctor and nurse communication composites. The allergy question originally had a scale of “never” to “always” that didn’t work well because the question is usually asked once upon admission. Changing the scale to a “yes/no” response provided very little variation across hospitals.


Following a thorough, multi-stage review process, HCAHPS was endorsed by the National Quality Forum (NQF) board in May 2005. In the process, NQF recommended a few modifications to the instrument. As a result of the recommendation of the National Quality Forum Consensus Development Process, the two courtesy and respect items were added back into the survey. The review committee felt that these questions are important to all patients, but may be particularly meaningful to patients who are members of racial and ethnic minority groups. The two reinstated items were: “During this hospital stay, how often did nurses treat you with courtesy and respect?”, and “During this hospital stay, how often did doctors treat you with courtesy and respect?”.

B 9.0 Payments/Gifts to Respondents

There are no provisions for payments or gifts to respondents.

B 10.0 Assurance of Confidentiality

All information obtained through the survey will be reported in the aggregate. No individual respondents’ information will be reported independently or with identifying information.

We have designed the data files so that the hospital/vendor will be submitting a de-identified dataset to CMS through a QIO, in accordance with 45 CFR § 164.514.


In all the modes of survey administration, guidelines are included on issues related to confidentiality:

  • Cover letters will not be attached to the survey

  • Respondents’ names will not appear on the survey

  • Interviewers will not leave messages on answering machines or with household members, since this could violate a respondent’s privacy

B 11.0 Information of a Sensitive Nature

There are no questions of a sensitive nature on the survey.

B 12.0 Estimates of Annualized Burden

National Implementation

We have assumed that roughly 4,000 hospitals would volunteer to participate in the national implementation of HCAHPS, yielding approximately 3,000,000 potentially eligible patients. We realize that this is an overestimate of participation rates, but we do not know at this point how many hospitals will volunteer for this initiative. For now, we have based this estimate on 90% of the 4,000 hospitals surveying 750 eligible discharges each and 10% of the hospitals surveying 300 eligible discharges each. The estimated time to complete the survey is 7 minutes.

Estimated annual burden for 2007: 2,820,000 survey attempts, for a total of 329,940 burden hours.
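
As a rough check of the figures above, under the stated assumptions (90% of 4,000 hospitals surveying 750 eligible discharges each, 10% surveying 300 each, and 7 minutes per response):

\[
0.90 \times 4{,}000 \times 750 \;+\; 0.10 \times 4{,}000 \times 300 \;=\; 2{,}700{,}000 + 120{,}000 \;=\; 2{,}820{,}000 \ \text{attempts}
\]

\[
2{,}820{,}000 \times \frac{7 \ \text{minutes}}{60} \;\approx\; 329{,}000 \ \text{hours},
\]

which is consistent with the 329,940 burden hours shown above.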

B 13.0 Capital Costs

There is no capital cost associated with this information collection request.

B 14.0 Estimates of Annualized Cost to the Government

Costs to the government are for training; technical assistance; ensuring the integrity of the data; approving hospitals/vendors for HCAHPS; accumulating the data; analyzing the data; making adjustments for patient-mix and mode of administration; and public reporting.


Hospitals have the option to conduct HCAHPS as a stand-alone survey or to integrate it with their existing survey activities. They can choose to administer HCAHPS by mail, phone, or active IVR.


Costs associated with collecting HCAHPS will vary depending on:

    • The method hospitals currently use to collect patient survey data;

    • The number of patients surveyed (target is 300 per year); and

    • Whether it is possible to incorporate HCAHPS into their existing survey.


Abt estimates that the average data collection cost for a hospital conducting the 27-item version of HCAHPS as a stand-alone survey would be between $3,300 - $4,575 per year, assuming that 80-85 percent of hospitals collect HCAHPS by mail and the remainder by phone or active IVR. The costs for combining HCAHPS with existing surveys would be considerably less. It would cost $978 per hospital annually to incorporate the 27-item version of HCAHPS into existing surveys.


Abt Associates estimates that the nationwide data collection cost for conducting the 27-item version of HCAHPS would be between $4.1 million and $19.1 million per year if all eligible hospitals participated, depending upon the extent to which hospitals integrate HCAHPS with their existing survey activities or conduct it as a stand-alone survey.

B 15.0 Changes in Burden

Based upon experience gained in the dry run and mode experiment, we have increased the average number of minutes to complete the HCAHPS survey from 6 to 7 minutes. Also, because the mode experiment was completed in 2006, we have removed the data collection burden associated with it.


B 16.0 Time Schedule, Publication, and Analysis Plans

Following OMB approval, introductory training for the implementation of HCAHPS was initiated by CMS in January 2006. In January 2006, an in-person HCAHPS Introductory Training session was held at the CMS Central Office in Baltimore, Maryland, followed by several webinar-based training sessions. The HCAHPS Introductory Training (webinar-based) was repeated for the benefit of newly joining self-administering hospitals and survey vendors in April 2006 and January 2007.


In addition to the introductory training, in May 2007 CMS offered HCAHPS Update Training. All survey vendors and hospitals that plan to administer HCAHPS are required to attend both the one-day HCAHPS Introductory Training and the HCAHPS Update Training sessions.


Approximately 2,700 hospitals began voluntarily collecting HCAHPS data on an ongoing basis in October 2006. We anticipate that in July 2007 (when participation in HCAHPS is tied to the annual payment update for RHQDAPU-eligible IPPS hospitals), approximately 4,000 hospitals will be collecting HCAHPS data on an ongoing basis.


The first public reporting of hospital results on HCAHPS will occur in March 2008. The results presented at that time will be based on data collected from patients discharged from October 2006 to June 2007. Though this initial reporting period covers nine months, the target for hospitals was at least 300 completed surveys. As stated earlier, the HCAHPS measures will be adjusted for survey mode and patient-mix effects before they are publicly reported on the Hospital Compare website. After the initial public reporting in March 2008, the HCAHPS results will be updated quarterly and will be based on 12 months of data, with the oldest quarter rolled out while the newest quarter is rolled in.

B 17.0 OMB Expiration Date Exemption

CMS would like an exemption from displaying the expiration date on this collection as the survey will be used on a continuing basis. To include an expiration date would result in having to discard a potentially large number of surveys.


B 18.0 Exceptions to Certification Statement

The proposed data collection does not involve any exceptions to the certification statement identified in line 19 of OMB Form 83-I.

Supporting Statement – Part B:

COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


C 1.0 Respondent Universe and Sampling Methods


National Implementation


Population and sampling frame


HCAHPS is designed to collect data on care from the patient’s perspective for general acute care hospitals. Pediatric, psychiatric, and other specialty hospitals are not included (additional/different aspects of care need to be examined for these specialty settings).

Within general acute care hospitals, HCAHPS scores are designed to reflect the care received by patients of all payers (not just Medicare patients). Specifically, this includes the population of non-psychiatric and non-pediatric patients who had an overnight stay in the hospital and were alive when discharged.


Pediatric patients (under age 18 at admission) and psychiatric patients are excluded because the current HCAHPS instrument is not designed to address the behavioral health issues pertinent to psychiatric patients or the situation of pediatric patients and their families. Patients who did not have an overnight stay are excluded because we don’t want to include patients who had very limited inpatient interaction in the hospital (e.g., patients who were admitted for a short period for purely observational purposes; patients getting only outpatient care are not included in HCAHPS). Patients who died in the hospital are excluded because HCAHPS is designed to capture information from the perspective of the patient, not the family. Some states have further restrictions on patients who may be contacted. Hospitals/vendors should exclude other patients as required by law or regulation in the state in which they operate.


The monthly sampling frame that defines this population for a given hospital includes all discharges between the first and last days of the month, with the exclusions noted above.


Sampling approach

To keep sampling as simple as possible, the hospital/vendor draws a simple random sample each month from the sampling frame of relevant discharges, as noted above. Sampling can be done at one time after the end of the month or continuously throughout the month, as long as a simple random sample is generated for the month (the hospital/vendor can choose what works best with their current survey activities for internal improvement).


As noted above, monthly data will be cumulated to produce a rolling 12-month data file for each hospital that will be used to produce hospital-level scores for the HCAHPS measure. A target for statistical precision of these scores is based on a reliability criterion. Higher reliability means higher “signal to noise” in the data. The target is that the reliability for the global ratings and most composites be 0.8 or more. Based on this, it will be necessary for each hospital to obtain 300 completed HCAHPS questionnaires over a 12 month period.


In order to calculate the number of discharges that need to be sampled each month to reach this target, it is necessary to take into account the proportion of sampled patients expected to complete the survey (not all sampled patients who are contacted to complete the survey actually complete it; the target is 300 completes). This is a function of the proportion of sampled patients who turn out to be ineligible for the survey and the survey response rate among eligible respondents. The calculation of the sample size needed each month can be summarized as follows:


Step 1 – Identify the number of completes needed over 12 months

C = number of completes needed = 300


Step 2 – Estimate the proportion of sampled patients expected to complete the survey

Let:

I = expected proportion of ineligible sampled patients

R = expected survey response rate among eligible respondents


P = proportion of sampled patients expected to complete the survey

= (1 – I) x R


Step 3 – Calculate the number of discharges to sample


N12 = sample size needed over 12 months = C / P

N1 = Sample size needed each month = N12 / 12



Some small hospitals will not be able to reach the target of 300 completes in a given 12-month period. In such cases, the hospital should sample all discharges and attempt to obtain as many completes as possible.
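
As a purely hypothetical illustration of when this census rule applies, consider a hospital with 600 eligible discharges per year, an assumed 5 percent ineligibility rate, and a 40 percent response rate:

\[
600 \times (1 - 0.05) \times 0.40 = 228 \ \text{expected completes} < 300,
\]

so this hospital would survey all of its eligible discharges rather than draw a sample.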


Response rate

A great deal of public input regarding response rate was received. We seriously considered all of this input; as a result, we have chosen an administration protocol designed to achieve, on average, at least a 40% response rate. CMS is encouraging hospitals/vendors to strive for higher response rates. As part of the reporting displays for HCAHPS, we plan to report response rates.

The raw response rate is the total number of completed surveys divided by the total number of respondents sampled. We recommend that a questionnaire be considered complete if half or more of the items applicable to everyone (i.e., excluding items that apply to only a subset of hospitalized patients) are completed. In addition, the denominator of the raw response rate should be adjusted for any patients who were included in the sample but were ineligible (e.g., deceased during the data collection period). Vendors should make every effort to obtain complete surveys from all eligible respondents. Cases with bad contact information (out-of-date addresses or phone numbers) or other barriers to contact (e.g., respondent out of town or on vacation) remain eligible and should not be removed from the denominator.
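
As a simple sketch of the raw response rate calculation just described (the counts below are illustrative only):

    def raw_response_rate(completed, sampled, ineligible):
        # Raw response rate = completed surveys / (sampled - ineligible).
        # Only truly ineligible cases (e.g., deceased) come out of the
        # denominator; bad addresses, wrong phone numbers, and unreachable
        # respondents remain in it.
        return completed / float(sampled - ineligible)

    # Hypothetical tallies: 800 sampled, 25 found ineligible, 310 completes.
    rate = raw_response_rate(completed=310, sampled=800, ineligible=25)
    # rate is 0.40, i.e. a 40 percent response rate.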


C 2.0 Information Collection Procedures

Flexibility in the mode of administering the survey is permitted. The hospitals/vendors may use any of the following modes: telephone only, mail only, a mixed methodology of mail with telephone follow-up, or active interactive voice response (IVR). All modes of administration require following a standardized protocol. These four survey modes were chosen to achieve, on average, at least a 40 percent response rate. CMS has conducted a large-scale mode experiment to detect the presence of, and then develop adjustments for, survey mode effects, as described earlier. Before publicly reporting HCAHPS results, CMS will adjust the hospital-level scores for differences across hospitals in the mode of administration. The algorithm for the patient-mix adjustment can be found in Appendix B.


Modes of Survey Administration

Mail Only

For the mail only option, the hospital/vendor is required to send the HCAHPS questionnaire (alone or combined with hospital-specific questions), with a cover letter, between 48 hours and 6 weeks following discharge. CMS provides sample cover letters to hospitals/vendors in its training program for HCAHPS. The hospitals/vendors may tailor their letters, but the letters must describe the purpose of the survey and state that participation is voluntary and will not affect the patient’s health care benefits.


The hospital/vendor is required to send a second questionnaire with a reminder/thank you letter to those not responding after the first mailing. Data collection is closed out for a particular respondent 6 weeks following the mailing of the first questionnaire.


Telephone Only

For the telephone only option, the hospital/vendor is required to begin data collection between 48 hours and 6 weeks following discharge. The hospital/vendor must attempt to contact respondents up to 5 times unless the respondent explicitly refuses to complete the survey. These attempts must be made on different days of the week and different times of the day, in different weeks to ensure that as many respondents are reached as feasible. Data collection is closed out for a particular respondent 6 weeks following the first telephone attempt.


For the HCAHPS portion of the survey, CMS provides scripts to follow for the telephone interviewing. The interviewers conducting the survey must be trained before beginning interviewing. In its training program for HCAHPS, CMS provides guidance on how to train interviewers to conduct HCAHPS. The training program ensures that interviewers read questions as worded, use non-directive probes, maintain a neutral and professional relationship with the respondent, and record only the answers that the respondents themselves choose.


Mixed Mode

A third option is a combination of mail and telephone. In this mixed mode of administration, there is one wave of mailing (cover letter and questionnaire) and up to five telephone call-back attempts for non-respondents. The first survey must be sent out between 48 hours and six weeks following discharge. Telephone follow-up occurs for non-respondents to the questionnaire mailing. The telephone attempts must be made on different days of the week and different times of the day, in different weeks to ensure that as many respondents are reached as feasible. Telephone interviewing ends 6 weeks after the first survey mailing. Similar to the telephone only mode, CMS provides telephone scripts for the hospitals/vendors to follow.

Active IVR

For active IVR, hospitals/vendors must initiate data collection by phone between 48 hours and 6 weeks following discharge. A live interviewer first asks the respondent if he/she is willing to complete the survey using the IVR system. Through the IVR system, the respondent completes the survey using the touch-tone keypad on their phone. The hospital/vendor is required to provide an option for the respondent to opt out of the system and return to a live interviewer.


Similar to the telephone mode, the hospital/vendor must call each respondent up to 5 times unless the respondent refuses to complete the survey. These attempts must be made on different days of the week and different times of the day and in different weeks to ensure that as many respondents are reached as feasible. Data collection is closed out for a particular respondent 6 weeks following the first telephone attempt.
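
For reference, the key timing parameters of the four protocols described above can be summarized as follows; this is an illustrative restatement only, and the key names and structure are not part of any official HCAHPS file format.

    PROTOCOLS = {
        "mail_only": {
            "start_after_discharge": "48 hours to 6 weeks",
            "follow_up": "second questionnaire with reminder/thank-you letter",
            "close_out": "6 weeks after first questionnaire mailing",
        },
        "telephone_only": {
            "start_after_discharge": "48 hours to 6 weeks",
            "max_attempts": 5,  # on different days, times of day, and weeks
            "close_out": "6 weeks after first telephone attempt",
        },
        "mixed_mode": {
            "start_after_discharge": "48 hours to 6 weeks",
            "follow_up": "up to 5 telephone call-backs for mail non-respondents",
            "close_out": "6 weeks after first survey mailing",
        },
        "active_ivr": {
            "start_after_discharge": "48 hours to 6 weeks",
            "max_attempts": 5,
            "opt_out": "respondent may return to a live interviewer",
            "close_out": "6 weeks after first telephone attempt",
        },
    }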


C 3.0 Methods to Maximize Response Rates

Ways to avoid problems and increase response rates have been integrated into the current data collection protocol; among these are:

  1. To avoid undeliverable mail, the lag time between the date of discharge and the mailing of the questionnaire has been shortened.


  2. The most specific and locally used hospital name is used in the cover letter to increase patient recognition.


  3. Response rates will be published on the website displays.


  4. We require two waves of mailing of the survey (mail only protocol) to boost response rates. Many hospitals currently do a one-wave mailing.


  5. We require 5 call-back attempts for the telephone only protocol, the mixed mode protocol, and the IVR protocol.


C 4.0 Tests of Procedures

Most of the questions on the survey instrument are modifications of existing survey questions from hospital satisfaction surveys conducted by private survey vendors or modifications of current CAHPS questions. Some questions have been newly developed by CAHPS grantees that have had many years of experience developing these types of questions. The HCAHPS instrument and data collection protocol have been modified to include input from consumers, hospitals and vendors.


CMS carried out a three-state pilot test with adult medical, surgical, and obstetric patients at 132 hospitals. Data collection took place between June 2003 and October 2003, with a sample size of 49,812 inpatient events. In the pilot, sampled patients were sent an advance letter, which was followed about one week later by another mailing that contained the questionnaire. After another 10 days, a postcard was mailed. Approximately four weeks after the mailing of the postcard, non-responding patients from core hospitals were followed up by telephone; a maximum of five follow-up phone attempts was used to complete the interview with core respondents. Four weeks following the mailing of the postcard, non-core patients were followed up with a second mailing of the questionnaire. The final response rate was 35% for the non-core sample, which used mail-only administration. There was an almost 11 percentage point increase in the response rate due to the second mailing of the questionnaire. The final response rate was 45.6% for the core sample, which included telephone follow-up. The telephone follow-up accounted for an almost 18 percentage point increase in the response rate.


Connecticut state legislation requires that all licensed hospitals in the state be compared on patient satisfaction measures. In order to meet this mandate, Connecticut administered the 66-item version of the HCAHPS survey to patients from all 30 acute care hospitals in Connecticut. The data collection covered the same patient service areas as the three-state pilot (medical, surgical, and ob/gyn), and the survey was administered in two languages, English and Spanish. The data collection strategy included two survey mailings without a reminder/thank you postcard or letter. A replication of the patient-level psychometric analysis of the HCAHPS instrument was conducted using the 3-state pilot sample (Arizona, Maryland, and New York) and the Connecticut sample.


The revised HCAHPS measure is compiled into the following seven composites: communication with nurses (n = 3), communication with doctors (n=3), communication about medicine (n = 2), nursing services (n = 2), discharge information (n = 2), pain control (n = 2), and cleanliness and quiet of the physical environment (n = 2). Within both samples, the reliability of the seven composites was estimated using the internal consistency method (Cronbach’s alpha coefficient). The construct validity of the composites was evaluated with regard to their relationships to an overall rating of the hospital (Hospital Rating) and whether the patient would recommend the hospital to others (Hospital Recommendation). The results of these analyses indicate that the revised HCAHPS measure performed similarly in the CT and three-state pilot data sets. See Appendix C for details.


Dry Run


Self-administering hospitals and survey vendors are required to conduct a “dry run” of the HCAHPS survey before they can participate. The purpose of the dry run is for self-administering hospitals and survey vendors to gain experience in every phase of the survey process and its protocols, including data submission to the HCAHPS data warehouse. Dry runs have been conducted in Spring 2006 and March 2007, and the next dry run is scheduled for September 2007. Using the official survey instrument and the approved modes of implementation and data collection protocols, self-administering hospitals and survey vendors collect HCAHPS data for eligible patients and submit it to CMS via QualityNet Exchange. In addition to providing real experience, participation in the dry run produces HCAHPS data that the hospital can use for self-assessment. Data collected for the dry run is not publicly reported.


The dry runs have provided additional benefits to CMS by providing a test of its HCAHPS-related systems and a check on the usability of the HCAHPS protocols. Based upon hospitals’ experience in the first dry run, CMS made several improvements to the data coding and submission processes, as well as deciding to exclude several small categories of patients from eligibility for logistical reasons. These patient categories are: patients who were prisoners, who were discharged to hospice care, who had a foreign home address, or who upon admission had requested that the hospital not survey them.

C 5.0 Statistical Consultants

Marc Elliott - RAND

Jack Fowler – University of Massachusetts, Boston

Steven Garfinkel – AIR

Ron Hays – RAND

San Keller – AIR

Roger Levine – AIR

Alan Zaslavsky – Harvard University

Appendix A

HCAHPS Survey Instrument

Inside Cover page Approved OMB #: 0938-0981

Hospital CAHPS®

Cover Letter

Statement of Burden


According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 0938-0981. The time required to complete this information collection is estimated to average 7 minutes per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: Centers for Medicare & Medicaid Services, Attn: Report Clearance Officer, 7500 Security Boulevard, Baltimore, Maryland 21244-1850.


CMS-10102

CAHPS Hospital Survey

SURVEY INSTRUCTIONS


  • You should only fill out this survey if you were the patient during the hospital stay named in the cover letter. Do not fill out this survey if you were not the patient.

  • Answer all the questions by checking the box to the left of your answer.

  • You are sometimes told to skip over some questions in this survey. When this happens you will see an arrow with a note that tells you what question to answer next, like this:

Yes

No If No, Go to Question 1


You may notice a number on the cover of this survey. This number is ONLY used to let us know if you returned your survey so we don't have to send you reminders.

Please note: Questions 1-22 in this survey are part of a national initiative to measure the quality of care in hospitals. OMB #0938-0981



Please answer the questions in this survey about your stay at the hospital named on the cover. Do not include any other hospital stay in your answers.


YOUR CARE FROM NURSES



1. During this hospital stay, how often did nurses treat you with courtesy and respect?

1 Never

2 Sometimes

3 Usually

4 Always



2. During this hospital stay, how often did nurses listen carefully to you?

1 Never

2 Sometimes

3 Usually

4 Always



3. During this hospital stay, how often did nurses explain things in a way you could understand?

1 Never

2 Sometimes

3 Usually

4 Always


4. During this hospital stay, after you pressed the call button, how often did you get help as soon as you wanted it?

1 Never

2 Sometimes

3 Usually

4 Always

9 I never pressed the call button


YOUR CARE FROM DOCTORS



5. During this hospital stay, how often did doctors treat you with courtesy and respect?

1 Never

2 Sometimes

3 Usually

4 Always



6. During this hospital stay, how often did doctors listen carefully to you?

1 Never

2 Sometimes

3 Usually

4 Always


7. During this hospital stay, how often did doctors explain things in a way you could understand?

1 Never

2 Sometimes

3 Usually

4 Always


THE HOSPITAL ENVIRONMENT



8. During this hospital stay, how often were your room and bathroom kept clean?

1 Never

2 Sometimes

3 Usually

4 Always


9. During this hospital stay, how often was the area around your room quiet at night?

1 Never

2 Sometimes

3 Usually

4 Always

YOUR EXPERIENCES IN THIS HOSPITAL



10. During this hospital stay, did you need help from nurses or other hospital staff in getting to the bathroom or in using a bedpan?

1 Yes

2 No If No, Go to Question 12


11. How often did you get help in getting to the bathroom or in using a bedpan as soon as you wanted?

1 Never

2 Sometimes

3 Usually

4 Always


12. During this hospital stay, did you need medicine for pain?

1 Yes

2 No If No, Go to Question 15


13. During this hospital stay, how often was your pain well controlled?

1 Never

2 Sometimes

3 Usually

4 Always


14. During this hospital stay, how often did the hospital staff do everything they could to help you with your pain?

1 Never

2 Sometimes

3 Usually

4 Always


15. During this hospital stay, were you given any medicine that you had not taken before?

1 Yes

2 No If No, Go to Question 18


16. Before giving you any new medicine, how often did hospital staff tell you what the medicine was for?

1 Never

2 Sometimes

3 Usually

4 Always


17. Before giving you any new medicine, how often did hospital staff describe possible side effects in a way you could understand?

1 Never

2 Sometimes

3 Usually

4 Always


WHEN YOU LEFT THE HOSPITAL



18. After you left the hospital, did you go directly to your own home, to someone else’s home, or to another health facility?

1 Own home

2 Someone else’s home

3 Another health facility If Another, Go to Question 21


19. During this hospital stay, did doctors, nurses or other hospital staff talk with you about whether you would have the help you needed when you left the hospital?

1 Yes

2 No


20. During this hospital stay, did you get information in writing about what symptoms or health problems to look out for after you left the hospital?

1 Yes

2 No


OVERALL RATING OF HOSPITAL



Please answer the following questions about your stay at the hospital named on the cover. Do not include any other hospital stays in your answer.

21. Using any number from 0 to 10, where 0 is the worst hospital possible and 10 is the best hospital possible, what number would you use to rate this hospital during your stay?

00 0 Worst hospital possible

01 1

02 2

03 3

04 4

05 5

06 6

07 7

08 8

09 9

10 10 Best hospital possible


22. Would you recommend this hospital to your friends and family?

1 Definitely no

2 Probably no

3 Probably yes

4 Definitely yes


ABOUT YOU

There are only a few items left.

23. In general, how would you rate your overall health?

1 Excellent

2 Very good

3 Good

4 Fair

5 Poor


24. What is the highest grade or level of school that you have completed?

1 8th grade or less

2 Some high school, but did not graduate

3 High school graduate or GED

4 Some college or 2-year degree

5 4-year college graduate

6 More than 4-year college degree


25. Are you of Spanish, Hispanic or Latino origin or descent?



1 No, not Spanish/Hispanic/Latino

2 Yes, Puerto Rican

3 Yes, Mexican, Mexican-American, Chicano

4 Yes, Cuban

5 Yes, other Spanish/Hispanic/Latino



26. What is your race? Please choose one or more.

1 White

2 Black or African American

3 Asian

4 Native Hawaiian or other Pacific Islander

5 American Indian or Alaska Native




27. What language do you mainly speak at home?

1 English

2 Spanish

3 Some other language (please print): _____________________


THANK YOU


Please return the completed survey in the postage-paid envelope.



Appendix B


Patient-mix adjustment


Let $y_{ipj}$ represent the response to item $i$ of respondent $j$ from hospital $p$ (after recoding, if any, has been performed). The model for adjustment of a single item $i$ is of the form

\[
y_{ipj} = \beta_i' x_{pj} + \alpha_{ip} + \varepsilon_{ipj},
\]

where $\beta_i$ is a regression coefficient vector, $x_{pj}$ is a covariate vector consisting of six or more adjuster covariates (as described above), $\alpha_{ip}$ is an intercept parameter for hospital $p$, and $\varepsilon_{ipj}$ is the error term. The estimates are given by the least-squares equation

\[
\begin{pmatrix} \hat{\beta}_i \\ \hat{\alpha}_i \end{pmatrix} = (X'X)^{-1} X' y_i,
\]

where $\hat{\alpha}_i$ is the vector of estimated hospital intercepts, $y_i$ is the vector of responses, and the covariate matrix is

\[
X = \bigl( Z \;\; D_1 \;\; D_2 \;\cdots\; D_P \bigr),
\]

where the columns of $Z$ are the vectors of values of each of the adjuster covariates, and $D_p$ is a vector of indicators for being discharged from hospital $p$, $p = 1, 2, \ldots, P$, with entries equal to 1 for respondents in hospital $p$ and 0 for others.

Finally, the estimated intercepts are shifted by a constant amount to force their mean to equal the mean of the unadjusted hospital means (to make it easier to compare adjusted and unadjusted means), giving adjusted hospital means

\[
\tilde{\alpha}_{ip} = \hat{\alpha}_{ip} + \Bigl( \bar{m}_i - \tfrac{1}{P} \textstyle\sum_{p=1}^{P} \hat{\alpha}_{ip} \Bigr),
\]

where $\bar{m}_i$ denotes the mean of the unadjusted hospital means for item $i$.

For single-item responses, these adjusted means are reported. For composites, the several adjusted hospital item means are combined using a weighted mean of the items in the composite.
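
The following is a minimal numeric sketch of the least-squares fit and intercept shift described above, assuming a toy dataset; it is illustrative only and is not the production HCAHPS adjustment code.

    import numpy as np

    def adjust_item(y, Z, hospital_ids):
        # Fit y = Z*beta + hospital intercepts by least squares, then shift
        # the estimated intercepts so their mean equals the mean of the
        # unadjusted hospital means, as described above.
        hospitals = sorted(set(hospital_ids))
        ids = np.array(hospital_ids)
        D = np.column_stack([(ids == p).astype(float) for p in hospitals])
        X = np.hstack([Z, D])                    # adjusters plus hospital indicators
        coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
        alpha_hat = coef[Z.shape[1]:]            # estimated hospital intercepts
        unadjusted = np.array([y[ids == p].mean() for p in hospitals])
        shift = unadjusted.mean() - alpha_hat.mean()
        return dict(zip(hospitals, alpha_hat + shift))

    # Toy example: 6 respondents, 2 hospitals, 2 adjuster covariates.
    y = np.array([4.0, 3.0, 4.0, 2.0, 3.0, 3.0])
    Z = np.array([[1, 0], [0, 1], [1, 1], [0, 0], [1, 0], [0, 1]], dtype=float)
    adjusted_means = adjust_item(y, Z, ["A", "A", "A", "B", "B", "B"])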


Appendix C

Replication of Patient-Level Psychometric Analysis of the HCAHPS Instrument Across Two Samples




Analyses were conducted to examine the psychometric properties of the revised form of the HCAHPS questionnaire used in two samples: (1) 3-State Pilot (3SP)1, N=19,568 and (2) Connecticut (CT), N=1,675.


The basic unit of reporting for the HCAHPS survey measure is the hospital. Thus, it is most appropriate to focus on the psychometric features of the measures at the hospital level. Hospital-level reliability captures the extent to which variation in scores on a composite reflects variation between hospitals, as opposed to random variation in patient responses within hospitals. Hospital-level correlations for construct validity capture the extent to which hospitals with high scores on the composites also have high scores on patient willingness to recommend the hospital and on the overall rating. Hospital-level reliabilities of the composites and hospital-level correlations of the composites with the global ratings for the three-state pilot were presented previously. Because of the more limited data available in the Connecticut pilot, individual-level analyses (which can also be informative) were conducted and are presented below.


The HCAHPS measure was compiled into the following seven composites: Communication with Nurses (n=3), Communication with Doctors (n=3), Communication about Medicine (n=2), Nursing Services (n=2), Discharge Information (n=2), Pain Control (n=2), and Physical Environment (n=2). (One item from the doctor communication and nurse communication composites was subsequently dropped.) Within both samples, the reliability of the seven composites was estimated using the internal consistency method (Cronbach’s alpha coefficient). The construct validity of the composites was evaluated with regard to their relationships to an overall rating of the hospital (Hospital Rating) and whether the patient would recommend the hospital to others (Hospital Recommendation). The results of these analyses indicate that the HOSPITAL CAHPS SURVEY measure performed similarly in the CT and 3SP data sets.
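
As a brief illustration of the internal consistency method named here, the following sketch computes Cronbach's alpha for a tiny, made-up set of item responses; the data are invented and are not drawn from either sample.

    import numpy as np

    def cronbach_alpha(items):
        # items: respondents x items matrix of numeric responses, no missing data.
        # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1.0)) * (1.0 - item_variances / total_variance)

    # Made-up responses (1-4 scale) to three items from five respondents.
    example = [[4, 4, 4], [3, 4, 3], [2, 3, 2], [4, 4, 3], [3, 3, 3]]
    alpha = cronbach_alpha(example)   # roughly 0.89 for these made-up data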


The alpha coefficients across the two data sets were comparable (see Table 1): in the 3SP data file the alpha coefficients ranged from .51 to .88 and in the CT sample the alphas ranged from .50 to .87. The same four of seven composites within both samples had alpha coefficients greater than .70. These were Communication with Nurses, Communication with Doctors, Nursing Services, and Pain Control. Analyses also revealed the same relationships across data sets of items to competing composites (see Table 1). Within both samples, the items comprising the Nursing Services composite (Q22, Q9) were correlated as or more strongly with the Communication with Nurses composite than with their own composite. And, items within the Physical Environment composite were correlated as or more strongly with three other composites (Communication with Nurses, Nursing Services, and Pain Control) than they were with their own composite. These relationships were found in the analyses presented in HOSPITAL CAHPS SURVEY Three-State Pilot Study Analysis Report at www.cms.hhs.gov/quality/hospital . For purposes of this analysis it is important to note that the relationships found in the 3SP data were replicated in the CT data.


With one exception, the results of the construct validity analyses were also very similar in the two data sets (see Table 1, aR2 values). Compared to the 3SP sample, Communication with Doctors had a somewhat stronger relationship to global ratings of hospital care in the CT sample. Table 2 presents a comparison of the descending rank order of correlations of the composite scores with the following global ratings: hospital rating, nurses rating, doctors rating, and hospital recommendation. As illustrated in the table, three composites emerge consistently, within both samples, as being among the most highly correlated with the global ratings: Nurse Communication, Nursing Services, and Pain Control.

Table 1: A Comparison of the Patient-Level Psychometric Analyses of the Seven-Factor Hospital-Level Structure between the 3-State Pilot and Connecticut Samples


3 State Pilot Sample

Layout: for each composite, Cronbach's alpha (α) and, for Hospital Rating and Recommend Hospital, the adjusted R-squared (aR2) and t-value for the composite's unique relationship to that criterion; for each item, any substantial correlation with a second composite (composite number, with the correlation in parentheses), the item-total correlation, and the t-values for Hospital Rating and Recommend Hospital.

(1) Communication with Nurse: α = .86; Hospital Rating aR2 = 0.47*, t-value = 101.55; Recommend Hospital aR2 = 0.37, t-value = 79.02
    Q5 RN Listen: item-total corr. .77; Hospital Rating t = 38.70; Recommend Hospital t = 28.59
    Q4 RN Respect: item-total corr. .73; Hospital Rating t = 48.02; Recommend Hospital t = 39.35
    Q6 RN Explain: item-total corr. .69; Hospital Rating t = 21.62; Recommend Hospital t = 16.22

(2) Communication with Doctors: α = .88; Hospital Rating aR2 = 0.24, t-value = 41.15; Recommend Hospital aR2 = 0.19, t-value = 34.67
    Q12 MD Listen: item-total corr. .81; Hospital Rating t = 13.07; Recommend Hospital t = 9.20
    Q11 MD Respect: item-total corr. .76; Hospital Rating t = 21.06; Recommend Hospital t = 19.13
    Q13 MD Explain: item-total corr. .73; Hospital Rating t = 1.20#; Recommend Hospital t = 1.49#

(3) Communication about Medication: α = .67; Hospital Rating aR2 = 0.18, t-value = 12.41; Recommend Hospital aR2 = 0.14, t-value = 7.99
    Q40 Allergies to Medicines: item-total corr. .51; Hospital Rating t = 9.73; Recommend Hospital t = 5.97
    Q41 Side-Effects of Medicine: item-total corr. .51; Hospital Rating t = 5.46; Recommend Hospital t = 3.92

(4) Nursing Services: α = .72; Hospital Rating aR2 = 0.36, t-value = 36.05; Recommend Hospital aR2 = 0.28, t-value = 28.67
    Q22 How often Bathroom: substantial corr. w/ composite 1 (.56); item-total corr. .56; Hospital Rating t = 18.35; Recommend Hospital t = 16.17
    Q9 Help when Call Button: substantial corr. w/ composite 1 (.63); item-total corr. .56; Hospital Rating t = 25.22; Recommend Hospital t = 18.39

(5) Discharge Information: α = .51; Hospital Rating aR2 = 0.08, t-value = 20.33; Recommend Hospital aR2 = 0.07, t-value = 19.30
    Q49 Symptoms may have: item-total corr. .35; Hospital Rating t = 11.49; Recommend Hospital t = 12.28
    Q48 Help for you at home?: item-total corr. .35; Hospital Rating t = 14.54; Recommend Hospital t = 12.01

(6) Pain Control: α = .83; Hospital Rating aR2 = 0.30, t-value = 39.96; Recommend Hospital aR2 = 0.24, t-value = 34.16
    Q32 Pain Controlled: item-total corr. .71; Hospital Rating t = 11.98; Recommend Hospital t = 9.94
    Q33 Pain Help All Can: item-total corr. .71; Hospital Rating t = 23.95; Recommend Hospital t = 20.84

(7) Physical Environment: α = .51; Hospital Rating aR2 = 0.26, t-value = 52.08; Recommend Hospital aR2 = 0.19, t-value = 33.19
    Q17 Room Clean: substantial corr. w/ composites 1 (.43), 4 (.44), 6 (.34); item-total corr. .34; Hospital Rating t = 45.11; Recommend Hospital t = 31.94
    Q18 Room Quiet: substantial corr. w/ composites 1 (.35), 4 (.39); item-total corr. .34; Hospital Rating t = 22.64; Recommend Hospital t = 11.21

* aR2 = Adjusted R-squared: how much variance in the dependent variable is accounted for by the set of items in the composite, controlling for the effect of the number of variables (i.e., all things being equal, a larger set of items will account for a larger percentage of the variance).

** t-values listed for each composite are for the unique relationship of that composite to the criterion variable, controlling for the other composites. t-values listed for each item are for the unique relationship of that item, controlling for the other report items in the questionnaire; therefore these are the same values as those depicted in Table 5. Probability of t-value is less than 0.01 unless otherwise denoted.

α = Cronbach’s alpha coefficient, an estimate of internal consistency reliability.

# = p > 0.01



Connecticut Sample

(Same layout as the 3 State Pilot Sample table above.)

(1) Communication with Nurse: α = .85; Hospital Rating aR2 = 0.45*, t-value = 26.01; Recommend Hospital aR2 = 0.34, t-value = 17.47
    Q5 RN Listen: item-total corr. .75; Hospital Rating t = 7.57; Recommend Hospital t = 3.67
    Q4 RN Respect: item-total corr. .71; Hospital Rating t = 14.81; Recommend Hospital t = 14.30
    Q6 RN Explain: item-total corr. .68; Hospital Rating t = 5.65; Recommend Hospital t = 1.62#

(2) Communication with Doctors: α = .87; Hospital Rating aR2 = 0.30, t-value = 16.80; Recommend Hospital aR2 = 0.22, t-value = 11.57
    Q12 MD Listen: item-total corr. .80; Hospital Rating t = 1.27#; Recommend Hospital t = 2.57#
    Q11 MD Respect: item-total corr. .74; Hospital Rating t = 7.92; Recommend Hospital t = 7.22
    Q13 MD Explain: item-total corr. .71; Hospital Rating t = 6.42; Recommend Hospital t = 1.14#

(3) Communication about Medication: α = .69; Hospital Rating aR2 = 0.19, t-value = 2.39#; Recommend Hospital aR2 = 0.15, t-value = 1.33#
    Q40 Allergies to Medicines: item-total corr. .52; Hospital Rating t = -0.18#; Recommend Hospital t = .88#
    Q41 Side-Effects of Medicine: item-total corr. .52; Hospital Rating t = 3.51; Recommend Hospital t = 1.54#

(4) Nursing Services: α = .71; Hospital Rating aR2 = 0.35, t-value = 12.08; Recommend Hospital aR2 = 0.29, t-value = 13.10
    Q22 How often Bathroom: substantial corr. w/ composite 1 (.52); item-total corr. .55; Hospital Rating t = 7.28; Recommend Hospital t = 9.94
    Q9 Help when Call Button: substantial corr. w/ composite 1 (.61); item-total corr. .55; Hospital Rating t = 7.41; Recommend Hospital t = 5.42

(5) Discharge Information: α = .50; Hospital Rating aR2 = 0.08, t-value = 5.65; Recommend Hospital aR2 = 0.08, t-value = 8.16
    Q49 Symptoms may have: item-total corr. .33; Hospital Rating t = 1.99#; Recommend Hospital t = 2.53#
    Q48 Help for you at home?: item-total corr. .33; Hospital Rating t = 5.23; Recommend Hospital t = 7.49

(6) Pain Control: α = .81; Hospital Rating aR2 = 0.32, t-value = 16.58; Recommend Hospital aR2 = 0.25, t-value = 13.12
    Q32 Pain Controlled: item-total corr. .68; Hospital Rating t = 7.07; Recommend Hospital t = 6.07
    Q33 Pain Help All Can: item-total corr. .68; Hospital Rating t = 8.44; Recommend Hospital t = 6.45

(7) Physical Environment: α = .51; Hospital Rating aR2 = 0.27, t-value = 19.07; Recommend Hospital aR2 = 0.19, t-value = 11.98
    Q17 Room Clean: substantial corr. w/ composites 1 (.38), 4 (.39), 6 (.35); item-total corr. .34; Hospital Rating t = 14.17; Recommend Hospital t = 8.71
    Q18 Room Quiet: substantial corr. w/ composites 1 (.35), 4 (.40); item-total corr. .34; Hospital Rating t = 9.79; Recommend Hospital t = 6.01

Footnotes *, **, α, and # are as defined above.





Table 2: Comparison of Rank Order (Descending) of Correlations of Composite Scores with Global Ratings between the 3 State Pilot and Connecticut Samples


3 State Pilot

Rank | Hospital Rating        | Nurses Rating          | Doctors Rating         | Recommend Hospital
1    | Nurse Com .68          | Nurse Com .79          | Doctor Com .79         | Nurse Com .60
2    | Nursing Services .60   | Nursing Services .66   | Nurse Com .48          | Nursing Services .52
3    | Pain .54               | Pain .54               | Pain .44               | Pain .48
4    | Physical Environ. .50  | Physical Environ. .47  | Nursing Services .42   | Doctor Com .43
5    | Doctor Com .48         | Doctor Com .42         | Medicine .37           | Physical Environ. .42
6    | Medicine .42           | Medicine .41           | Physical Environ. .33  | Medicine .37
7    | Discharge .28          | Discharge .25          | Discharge .25          | Discharge .26



Connecticut

Rank | Hospital Rating        | Nurses Rating          | Doctors Rating         | Recommend Hospital
1    | Nurse Com .67          | Nurse Com .76          | Doctor Com .80         | Nurse Com .58
2    | Nursing Services .59   | Nursing Services .67   | Nurse Com .51          | Nursing Services .54
3    | Pain .55               | Pain .53               | Pain .44               | Pain .48
4    | Doctor Com .54         | Doctor Com .48         | Nursing Services .44   | Doctor Com .47
5    | Physical Environ. .51  | Physical Environ. .45  | Medicine .37           | Physical Environ. .43
6    | Medicine .43           | Medicine .44           | Physical Environ. .36  | Medicine .38
7    | Discharge .27          | Discharge .25          | Discharge .25          | Discharge .27

1 The 3SP dataset comprises data obtained in Arizona, Maryland, and New York.
