Health Insurance Exchange Consumer Experience Surveys: Qualified Health Plan Enrollee Experience Survey
Supporting Statement—Part B
Collections of Information Employing Statistical Methods
OMB Control Number: 0938-1221
Centers for Medicare & Medicaid Services
Table of Contents
1. Potential Respondent Universe and Sampling Methods
2. Information Collection Procedures
3. Methods to Maximize Response Rates and Address Non-Response Bias
3.1 Evaluating Non-Response Bias
4. Tests of Procedures
5. Statistical Consultants
Appendix A: Non-response Bias Analysis of the 2019 QHP Enrollee Survey
Appendix B: 2019 QHP Enrollee Survey Case-Mix Adjusters Variables
List of Exhibits
Exhibit B1. Data Collection Protocol for 2021 QHP Enrollee Survey
Exhibit B2. QHP Enrollee Survey Response Rates
Exhibit B3. 2019 Response Rates by Enrollee Characteristics
Exhibit B4. Multivariable Logistic Regression – Demographic Associations with Survey Completion
Exhibit B5. Coding of Case-mix Adjusters Variables
This supporting statement includes information in support of the Qualified Health Plan (QHP) Enrollee Experience Survey (QHP Enrollee Survey or survey). The Centers for Medicare & Medicaid Services (CMS) developed the QHP Enrollee Survey, which contributes a subset of survey measures to the Quality Rating System (QRS) as directed by Section 1311(c)(4) of the Patient Protection and Affordable Care Act (PPACA). CMS designed the QRS to provide comparable and useful information to consumers about the quality of health care services and enrollee experience with QHPs offered through a Health Insurance Exchange (Exchange) (also known to consumers as Health Insurance Marketplaces).1
1. Potential Respondent Universe and Sampling Methods
As outlined in 45 CFR § 156.1125(b), QHP issuers are required to conduct the QHP Enrollee Survey if they have been operational for two consecutive years as the exact same product type, had more than 500 enrollees as of July 1 in the prior survey year, and have more than 500 enrollees as of January 1 of the ratings year.
CMS has established the reporting unit as the unique state-product type offered by a QHP issuer through the Exchange, including QHPs in both the SHOP and the individual market. The product type is defined as the discrete package of health insurance coverage benefits that a health insurance issuer offers using a particular product network type (i.e., health maintenance organization [HMO], preferred provider organization [PPO], exclusive provider organization [EPO], and point of service [POS]) within a service area. QHP issuers will create a sample frame for each reporting unit they offer through the Exchange. Child-only QHPs and standalone dental plans (SADPs) are excluded from the QHP Enrollee Survey at this time.
Eligible reporting units for the 2021 QHP Enrollee Survey are those that meet all of the following criteria:
The reporting unit was offered through an Exchange in 2020;
The reporting unit will be offered as the exact same product through an Exchange in 2021; and
The reporting unit had more than 500 enrollees on July 1, 2020 and January 1, 2021.
Issuers must collect and report survey data for all eligible reporting units to comply with the regulation. For the 2021 survey, CMS estimates that no more than 275 reporting units will be required to field the QHP Enrollee Survey. This estimate is primarily based on the 238 reporting units that met the participation criteria for the 2020 QHP Enrollee Survey2. In comparison, there were 205 reporting units in 2018 and 214 reporting units in 2019.
1.2 Sample Frame & Respondent Universe
QHP issuers generate the sample frame for each reporting unit, then provide the sample frames to their contracted HHS-approved survey vendor, which is responsible for drawing a simple random sample. The sample frame must include every enrollee within an eligible reporting unit who meets the following criteria:
Member is enrolled in an eligible QHP that is offered through the Exchange and provides family and/or adult medical coverage, and
Member is 18 years or older on December 31, 2020, and
Member has been enrolled in the eligible QHP from July 1, 2020 through December 31, 2020 with no more than one 45-day break in enrollment during those six months, and
Member is still enrolled in the eligible QHP on the specified anchor date in January 2021 (for example, January 6).
Issuers are to exclude individuals who discontinue their coverage through the QHP for plan year 2021 and those who are deceased as of the January 2021 anchor date. Vendors must deduplicate the sample frame by the Subscriber or Family Identifier (SFID) before drawing the sample to ensure that only one person in each household is surveyed. The sample frame is audited by a National Committee for Quality Assurance (NCQA)-Licensed Healthcare Effectiveness Data and Information Set (HEDIS®)3 Compliance Organization to verify that the sample frame conforms to all established specifications. Survey vendors then draw a simple random sample from each audited eligible sample frame.
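As an illustrative sketch of the household deduplication and simple random sampling described above (all identifiers, counts, and helper functions here are hypothetical, not part of the survey specification):

```python
import random

# Hypothetical sample-frame records: (enrollee_id, sfid) pairs, where
# SFID is the Subscriber or Family Identifier shared by a household.
frame = [
    ("E001", "F1"), ("E002", "F1"),   # same household
    ("E003", "F2"), ("E004", "F3"),
    ("E005", "F3"), ("E006", "F4"),
]

def deduplicate_by_sfid(records):
    """Keep one randomly chosen enrollee per SFID so that only one
    person in each household can be sampled."""
    by_sfid = {}
    for enrollee_id, sfid in records:
        by_sfid.setdefault(sfid, []).append(enrollee_id)
    return [(random.choice(ids), sfid) for sfid, ids in by_sfid.items()]

def draw_sample(deduped, n, seed=0):
    """Simple random sample of up to n deduplicated enrollees."""
    rng = random.Random(seed)
    return rng.sample(deduped, min(n, len(deduped)))

deduped = deduplicate_by_sfid(frame)   # 4 households remain
sample = draw_sample(deduped, n=3)     # 3 households sampled
print(len(deduped), len(sample))
```

In production the sample size would be the 1,300 enrollees discussed below rather than the toy value used here.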
For the 2021 QHP Enrollee Survey, CMS continues to propose a sample size of 1,300 enrollees per reporting unit. In the 2019 administration of the QHP Enrollee Survey, reporting units that had sufficient enrollment to produce the full sample of 1,300 enrollees received an average of 241 responses.
Additionally, CMS will continue to allow QHP issuers the option to oversample in increments of 5 percent up to 30 percent (1,690 enrollees maximum) upon approval. QHP issuers may want to oversample in an effort to increase response rates, increase the likelihood that a reportable result for QRS is achieved, or improve the reliability and validity of survey results. There is precedent to allow for oversampling in the implementation of CMS Consumer Assessment of Healthcare Providers and Systems (CAHPS®) surveys, such as the Medicare Advantage and Prescription Drug Plan (MA & PDP) CAHPS surveys, which permit oversampling at the contract level upon request to CMS. In the 2019 administration of the QHP Enrollee Survey, 95 out of the 214 eligible reporting units opted to oversample.
2. Information Collection Procedures
The QHP Enrollee Survey is conducted by HHS-approved survey vendors that meet minimum business requirements. A similar system is currently used for CMS surveys, including Medicare CAHPS, Hospital CAHPS (HCAHPS), Home Health CAHPS (HHCAHPS), the CAHPS Survey for Accountable Care Organizations, and the Health Outcomes Survey.
Under this model, all issuers that are required to conduct the QHP Enrollee Survey must contract with an HHS-approved survey vendor to collect the data and submit it to CMS on their behalf (45 CFR § 156.1125(a)). CMS is responsible for approving and training vendors, providing technical assistance to vendors, and overseeing vendors to verify that they are following the data collection protocols.
The 2021 QHP Enrollee Survey continues to use the same data collection protocol: a mixed-mode methodology that combines internet, mail, and telephone surveys. All sampled enrollees receive a prenotification letter that informs them that they have been sampled for the survey and provides them with information about the survey and how the data collected will be used. The prenotification letter also provides information on completing the survey online, including the website URL and the sample member’s login credentials, which are unique to each sample member.
On Day 6 of fielding, sampled enrollees who provided their issuer with an email address receive a notification email, and survey vendors mail the first mail survey. The mail survey does not include a link to the internet survey. The first mail survey is followed by a reminder email and a reminder letter on Day 19.
A second mail survey is mailed and a final reminder email is sent on Day 33. Finally, on Day 54, survey vendors initiate telephone follow-up calls, making no more than six attempts on varying days of the week and at differing times of the day. Data collection ends on Day 73. Exhibit B1 includes the complete fielding schedule.
Exhibit B1. Data Collection Protocol for 2021 QHP Enrollee Survey
Task | Timeframe
Sample enrollees per sampling protocols. | January – February 2021
Mail prenotification letter with internet survey login credentials. | Day 0 (Mid-Late February)
Send notification email (where an email address is available) and mail first survey. | Day 6
Send reminder email and mail reminder letter. | Day 19
Mail second survey and send final reminder email. | Day 33
Initiate telephone follow-up contacts for nonrespondents 21 calendar days after the second survey is mailed. | Days 54–72
End data collection. | Day 73 (Early-Mid May)
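The fielding milestones in Exhibit B1 can be laid onto calendar dates from any given Day 0. The sketch below assumes an illustrative Day 0 of February 22, 2021; the actual start date is set by the fielding schedule each year:

```python
from datetime import date, timedelta

# Illustrative Day-0 date (actual fielding starts mid-to-late February).
day0 = date(2021, 2, 22)

# Protocol milestones keyed by fielding day, per Exhibit B1.
milestones = {
    0: "Prenotification letter mailed",
    6: "Notification email and first mail survey",
    19: "Reminder email and reminder letter",
    33: "Second mail survey and final reminder email",
    54: "Telephone follow-up begins",
    73: "Data collection ends",
}

for day, task in sorted(milestones.items()):
    print(f"Day {day:2d} ({day0 + timedelta(days=day)}): {task}")
```

With this assumed start date, Day 73 falls on May 6, 2021, consistent with the early-to-mid-May close of fielding.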
QHP Enrollee Survey Scoring
CMS calculates composite scores, individual item scores, reliability, and response rates for each reporting unit using the CAHPS macro program for all scoring questions.4 Survey scores are weighted to adjust for the reporting unit’s beneficiary mix and the unequal probability of being selected for the survey across reporting units. The weight variable w is calculated according to the formula below to reflect the number of eligible enrollees represented by the sampled enrollee in each reporting unit:
w = (F / S) × m, where:
F = Total number of records in the deduplicated file for the reporting unit
S = Total number of sampled enrollees in the reporting unit
m = Number of survey-eligible enrollees covered by the sampled enrollee’s Subscriber or Family ID (SFID) before deduplication
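As a worked example of this weighting (the function name and all figures below are illustrative, not actual reporting-unit values):

```python
def enrollee_weight(total_dedup_records, total_sampled, sfid_members):
    """Sampling weight for one respondent: (records in the deduplicated
    frame / number of sampled enrollees) times the number of
    survey-eligible enrollees under the respondent's SFID before
    deduplication."""
    return (total_dedup_records / total_sampled) * sfid_members

# Illustrative reporting unit: 10,000 deduplicated households,
# 1,300 sampled enrollees, respondent's household had 2 eligible adults.
w = enrollee_weight(10_000, 1_300, 2)
print(round(w, 2))  # each such respondent represents about 15.38 enrollees
```

A respondent from a larger household thus carries proportionally more weight, since that one answer stands in for more eligible enrollees.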
Additionally, CMS case-mix adjusts composite and individual question scores to control for predictable, consistent effects associated with enrollee characteristics that are not under the control of the issuer, but that may affect survey responses. Common case-mix adjusters in survey-based applications include overall health status, age, and education. These factors are necessary to account for biases due to survey respondent tendencies.
CMS will determine the case-mix adjusters annually once the prior year’s QHP Enrollee Survey data are analyzed. A list of the 2019 QHP Enrollee Survey case-mix adjusters is available in Appendix B: 2019 QHP Enrollee Survey Case-Mix Adjusters Variables.
CMS calculates the case-mix adjustment for each question or composite in the following steps:
Perform ordinary least squares regression on the entire person-level QHP data using the n case-mix adjusters as independent variables and the unadjusted score as the dependent variable. Coefficients obtained in this regression estimate the tendency of enrollees across RUs to respond more or less favorably to a given question or composite. To counter the estimated tendency, CMS multiplies the coefficients by negative one (−1) and denotes them as a1, …, an.
Calculate reporting unit i’s raw unadjusted score, yi.
Calculate reporting unit i’s mean values for the adjuster variables5, x̄i1, …, x̄in.
Calculate national-level mean values for the adjuster variables, x̄1, …, x̄n.
Calculate reporting unit i’s case-mix adjusted score, ŷi, using the following formula:
ŷi = yi + Σk ak (x̄ik − x̄k), summing over the n adjusters k = 1, …, n
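The steps above can be sketched on toy data with a single case-mix adjuster. Everything below is illustrative (the production calculation pools all reporting units and uses the full set of adjusters listed in Appendix B):

```python
# Toy person-level records for two reporting units ("A" and "B"):
# (reporting_unit, adjuster_value, unadjusted_score). Values are
# hypothetical, not survey data.
data = [
    ("A", 1, 9.0), ("A", 1, 8.5), ("A", 0, 8.0),
    ("B", 0, 7.0), ("B", 0, 7.5), ("B", 1, 8.5),
]

xs = [x for _, x, _ in data]
ys = [y for _, _, y in data]
mean_x = sum(xs) / len(xs)            # national adjuster mean
mean_y = sum(ys) / len(ys)

# Step 1: OLS slope on the pooled person-level data, then negated.
slope = (sum((x - mean_x) * (y - mean_y) for _, x, y in data)
         / sum((x - mean_x) ** 2 for x in xs))
a = -slope

# Steps 2-5: shift each reporting unit's raw mean score by the negated
# coefficient times the gap between its adjuster mean and the national mean.
for unit in ("A", "B"):
    rows = [(x, y) for u, x, y in data if u == unit]
    raw = sum(y for _, y in rows) / len(rows)
    x_bar = sum(x for x, _ in rows) / len(rows)
    adjusted = raw + a * (x_bar - mean_x)
    print(unit, round(raw, 3), round(adjusted, 3))
```

Because unit A has a more favorable adjuster mix than the national average, its adjusted score moves down slightly and unit B's moves up, which is the intended effect of removing predictable response tendencies.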
3. Methods to Maximize Response Rates and Address Non-Response Bias
CMS will make every effort to maximize the response rate while retaining the voluntary nature of the survey. The mixed-mode methodology of the QHP Enrollee Survey provides sampled individuals with numerous methods for completing the survey across a 73-day time period. CMS developed this methodology using best practices within the survey research field, including the Tailored Design Method.6
CMS has also implemented an email outreach protocol, required survey vendors to optimize the internet survey for mobile devices, and added an alternative phone number field to the sample frame to maximize the survey response rate.
The 2014 psychometric test of the QHP Enrollee Survey achieved a 37.2 percent response rate, and the 2015 beta test achieved a response rate of 30.9 percent. Exhibit B2 summarizes the QHP Enrollee Survey response rates since nationwide fielding began.
Exhibit B2. QHP Enrollee Survey Response Rates
QHP Enrollee Survey Year | Response Rate
2016 | 30.0%
2017 | 26.7%
2018 | 25.8%
2019 | 23.2%
Response rates for the QHP Enrollee Survey are calculated using the Response Rate 3 (RR3) formula established by the American Association for Public Opinion Research (AAPOR). A response is counted as a completed survey if the respondent completes 50 percent of the questions that are applicable to all respondents, excluding the questions in the “About You” section of the survey. Other CMS surveys have reported response rates between 23% and 39%7.
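The 50-percent completion rule described above can be sketched as follows. The question identifiers and helper function are hypothetical; the actual set of questions applicable to all respondents is defined in the survey specifications:

```python
def is_complete(responses, applicable_to_all, about_you):
    """A return counts as a completed survey if at least 50 percent of
    the questions applicable to all respondents - excluding the
    'About You' section - were answered."""
    counted = [q for q in applicable_to_all if q not in about_you]
    answered = sum(1 for q in counted if responses.get(q) is not None)
    return answered >= 0.5 * len(counted)

# Illustrative question set: four counted items plus one demographic item.
applicable = ["q1", "q2", "q3", "q4", "q58"]
about_you = {"q58"}   # 'About You' items are excluded from the count

print(is_complete({"q1": 3, "q2": 1}, applicable, about_you))  # True
print(is_complete({"q1": 3}, applicable, about_you))           # False
```

The AAPOR RR3 denominator additionally estimates what share of unknown-eligibility cases are actually eligible, which this sketch does not attempt to model.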
3.1 Evaluating Non-Response Bias
Given that the 2019 QHP Enrollee Survey response rate was below 80 percent, CMS conducted a non-response bias analysis, per Office of Management and Budget (OMB) guidelines, to determine whether non-respondents systematically differ from respondents based on their experience with their health plan. Similar to previous survey administration years, CMS found that older enrollees (those over the age of 55) continued to be more likely to respond compared to younger enrollees. CMS also found associations between enrollee characteristics such as sex and written language preference and the likelihood of completing the survey. This pattern was observed in the 2014 psychometric test, the 2015 beta test, and the 2016–2018 nationwide administration years. More detailed findings from the 2019 non-response bias evaluation are included in Appendix A: Non-response Bias Analysis of the 2019 QHP Enrollee Survey.
4. Tests of Procedures
The QHP Enrollee Survey uses core questions and supplemental items from the CAHPS Health Plan Adult Commercial 5.0, CAHPS Health Plan Adult Commercial 4.0, and CAHPS Clinician and Groups 2.0 surveys, as well as items addressing Section 4302 of the PPACA. These items have undergone extensive testing and practical use by the Agency for Healthcare Research and Quality, OMB, NCQA, and other users since they were first developed.
In 2014, CMS conducted a psychometric test to evaluate the psychometric properties of the QHP Enrollee Survey in the Exchange population with 30 QRS reporting units (defined by state, issuer, and product type, i.e., HMO, PPO, POS, or EPO), and conducted a beta test in 2015. As a result of the psychometric and beta tests, CMS identified changes to the questionnaire items, data collection procedures, and sampling specifications, which were approved for the 2016 implementation of the survey.
For the 2017 implementation and beyond, CMS added six disability status items to address the requirements of the Section 4302 data collection standards of the PPACA. CMS removed several question items between 2017 and 2018 due to either low screen-in rates or the corresponding measures being removed from the QRS.
In consultation with the QHP Enrollee Survey Technical Expert Panel (TEP), CMS has continued to review the survey to evaluate the impact of the survey’s length and consider options to reduce respondent burden. In 2020, CMS conducted focus groups and cognitive testing to support identification of future refinements for the QHP Enrollee Survey. If any refinements are proposed that may impact the 2022 or 2023 administrations of the survey, CMS will submit a revision of this information collection.
5. Statistical Consultants
This sampling and statistical plan was prepared and reviewed by staff of CMS and Booz Allen Hamilton (Booz Allen). The primary statistical design was provided by Jeffrey Sussman, Ph.D., MPH, of Booz Allen at (301) 825-7155; Michelle Langer, Ph.D., of Booz Allen at (919) 260-7153; and Jennifer Hefele, Ph.D., of Booz Allen at (781) 372-2321.
Appendix A: Non-response Bias Analysis of the 2019 QHP Enrollee Survey
Given the potential detrimental impact that nonresponse bias can have on survey estimates, OMB requires that all federal surveys that achieve a response rate below 80 percent perform a nonresponse bias analysis (Office of Management and Budget, 2006). This nonresponse bias analysis utilizes demographic information that QHP issuers included in the sample frame about all survey-eligible individuals.
First, the research team calculated the 2019 response rate by enrollees’ sex, age, Census divisions, and spoken and written language preferences and then compared these rates to the national response rate. The results of this analysis are shown in Exhibit B3.
Exhibit B3. 2019 Response Rates by Enrollee Characteristics
Demographic | Response Rate | Number Complete | Denominator | Sampling Frame
2019 QHP Enrollee Survey | 23.2% | 51,682 | 293,011 | 297,943
Sex
Female | 25.0% | 32,075 | 128,267 | 165,721
Male | 20.9% | 19,607 | 93,794 | 132,222
Age
18–24 | 11.1% | 1,102 | 9,896 | 16,882
25–34 | 14.8% | 5,302 | 35,884 | 58,636
35–44 | 17.0% | 5,533 | 32,591 | 48,993
45–54 | 21.3% | 9,192 | 43,083 | 58,815
55–64 | 32.2% | 29,155 | 90,630 | 109,654
65–74 | 40.2% | 1,202 | 2,993 | 4,229
75+ | 33.0% | 196 | 594 | 734
Census Division
East North Central (IL, IN, MI, OH, WI) | 26.1% | 10,023 | 38,350 | 49,722
East South Central (AL, KY, MS, TN) | 25.5% | 1,806 | 7,083 | 9,355
Middle Atlantic (NJ, NY, PA) | 23.0% | 6,553 | 28,486 | 38,204
Mountain (AZ, CO, ID, MT, NV, NM, UT, WY) | 25.5% | 6,823 | 26,798 | 35,823
New England (CT, ME, MA, NH, RI, VT) | 23.1% | 5,211 | 22,548 | 29,671
Pacific (AK, CA, HI, OR, WA) | 21.9% | 6,915 | 31,607 | 43,183
South Atlantic (DE, DC, FL, GA, MD, NC, SC, VA, WV) | 20.2% | 6,674 | 32,989 | 47,200
West North Central (IA, KS, MN, MO, NE, ND, SD) | 25.3% | 3,755 | 14,861 | 18,980
West South Central (AR, LA, OK, TX) | 19.8% | 3,922 | 19,796 | 25,805
Spoken Language Preference
Chinese | 22.5% | 128 | 568 | 862
English | 23.1% | 18,009 | 77,974 | 102,707
Missing | 23.2% | 28,868 | 124,590 | 169,103
Other | 23.5% | 3,573 | 15,234 | 20,108
Spanish | 25.9% | 1,104 | 4,259 | 5,163
Written Language Preference
Chinese | 22.8% | 121 | 531 | 798
English | 22.8% | 14,143 | 62,115 | 81,631
Missing | 23.3% | 34,622 | 148,616 | 200,444
Other | 23.5% | 1,819 | 7,727 | 10,665
Spanish | 26.5% | 977 | 3,687 |
Similar to previous survey administration years, response rates were lower among younger enrollees (ages 44 and below) and higher among older enrollees (ages 55 and older). In addition, the response rate was higher among female sampled enrollees.
The project team then used a multivariable logistic regression model to estimate the probability of completing a survey and assess whether the respondent characteristics examined in the previous section were significantly associated with survey response or nonresponse, controlling for the impact of other respondent characteristics.
The team did not adjust the regression with the survey weights because nonresponse was confined to those in the sample (i.e., those who had an opportunity to respond or not respond). The logistic regression model included predictors for enrollee sex, age, written and spoken language preference, and Census division. The odds ratio and 95 percent confidence interval estimates from the logistic regression are presented in Exhibit B4.
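For intuition, an unadjusted odds ratio with a Wald 95 percent confidence interval can be computed directly from the Exhibit B3 counts. Note that this unadjusted estimate differs from the model-based estimates in Exhibit B4, which control for the other enrollee characteristics:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a completes and b non-completes in the comparison group,
    c completes and d non-completes in the reference group."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Female vs. male counts from Exhibit B3 (completes, non-completes).
f_yes, f_no = 32_075, 128_267 - 32_075
m_yes, m_no = 19_607, 93_794 - 19_607
or_, lo, hi = odds_ratio_ci(f_yes, f_no, m_yes, m_no)
print(round(or_, 3), round(lo, 3), round(hi, 3))
```

The unadjusted female-to-male odds ratio comes out near 1.26, somewhat below the adjusted 1.300 reported in Exhibit B4, which is expected once age, language, and geography are held constant.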
Exhibit B4. Multivariable Logistic Regression – Demographic Associations with Survey Completion
Demographic | Odds Ratio | 95% Lower CI | 95% Upper CI
Sex (Ref: Male)
Male | Ref | – | –
Female | 1.300 | 1.274 | 1.326
Age (Ref: 18–24)
18–24 | Ref | – | –
25–34 | 1.421 | 1.328 | 1.520
35–44 | 1.812 | 1.695 | 1.939
45–54 | 2.624 | 2.459 | 2.801
55–64 | 5.020 | 4.715 | 5.344
65–74 | 5.641 | 5.151 | 6.178
75 or older | 5.218 | 4.379 | 6.217
Written Language Preference (Ref: English)
English | Ref | – | –
Spanish | 1.411 | 1.236 | 1.610
Chinese | 1.087 | 0.741 | 1.595
Other | 1.014 | 0.937 | 1.099
Missing | 1.038 | 0.997 | 1.081
Spoken Language Preference (Ref: English)
English | Ref | – | –
Spanish | 0.938 | 0.830 | 1.061
Chinese | 0.707 | 0.487 | 1.025
Other | 0.949 | 0.893 | 1.010
Missing | 0.914 | 0.880 | 0.950
Census Division (Ref: South Atlantic)
South Atlantic (DE, DC, FL, GA, MD, NC, SC, VA, WV) | Ref | – | –
East North Central (IL, IN, MI, OH, WI) | 1.400 | 1.352 | 1.451
East South Central (AL, KY, MS, TN) | 1.403 | 1.323 | 1.489
Mountain (AZ, CO, ID, MT, NV, NM, UT, WY) | 1.352 | 1.301 | 1.405
New England (CT, ME, MA, NH, RI, VT) | 1.244 | 1.194 | 1.295
Pacific (AK, CA, HI, OR, WA) | 1.075 | 1.036 | 1.116
Middle Atlantic (NJ, NY, PA) | 1.139 | 1.096 | 1.183
West North Central (IA, KS, MN, MO, NE, ND, SD) | 1.476 | 1.411 | 1.545
West South Central (AR, LA, OK, TX) | 1.121 | 1.072 | 1.173
The multivariable logistic regression results showed that enrollee characteristics were associated with differences in the likelihood of survey completion, controlling for the effects of other characteristics. Females had 30 percent higher odds of completing the survey compared to males. Across age categories, the estimates suggest a trend such that older plan enrollees were more likely to complete the survey than younger enrollees. Compared to enrollees whose preferred written language was English, enrollees who preferred to write in Spanish were more likely to complete the survey. There were also differences in the estimated associations with survey completion across Census divisions. For example, being located in the West North Central division was associated with 47 percent higher odds of completing the survey when compared with being located in the South Atlantic division.
Appendix B: 2019 QHP Enrollee Survey Case-Mix Adjusters Variables
Exhibit B5 provides a list of the 2019 QHP Enrollee Survey case-mix adjusters.
Exhibit B5. Coding of Case-mix Adjusters Variables
Case-mix Variable | Survey Question | Variable Coding
Education | What is the highest grade or level of school that you have completed? | 1) 8th grade or less; 2) Some high school, but did not graduate; 3) High school graduate or GED (REFERENCE); 4) Some college or 2-year degree; 5) 4-year college graduate; 6) More than 4-year college degree
Overall Health Rating | In general, how would you rate your overall health? | 1) Excellent (REFERENCE); 2) Very good; 3) Good; 4) Fair; 5) Poor
Mental Health Rating | In general, how would you rate your overall mental or emotional health? | 1) Excellent (REFERENCE); 2) Very good; 3) Good; 4) Fair; 5) Poor
Age | What is your age? | 1) 18 to 24; 2) 25 to 34 (REFERENCE); 3) 35 to 44; 4) 45 to 54; 5) 55 to 64; 6) 65 to 74; 7) 75 or older
Survey Language | From data collection vendor | English (REFERENCE); Spanish; Chinese
Survey Mode | From data collection vendor | Telephone (REFERENCE); Web
Chronic Conditions and Medications | Q50: Did you get health care 3 or more times for the same condition or problem? Q51: Is this a condition or problem that has lasted for at least 3 months? Q52: Do you now need or take medicine prescribed by a doctor? Q53: Is this medicine to treat a condition that has lasted for at least 3 months? | 1) No care and no medications (Q50 = no and Q52 = no); 2) Acute care and/or acute medications (Q50 = yes and/or Q52 = yes); 3) Chronic care or chronic medications ((Q50 = yes and Q51 = yes) OR (Q52 = yes and Q53 = yes)); 4) Chronic care and chronic medications (all four questions = yes)
Received Help Responding | Did someone help you complete this survey? | 1) Yes; 0) No (REFERENCE)
1 Health Insurance ExchangeSM and ExchangeSM are service marks of the U.S. Department of Health & Human Services.
2 CMS suspended data collection in April 2020 due to the COVID-19 pandemic.
3 Healthcare Effectiveness Data and Information Set (HEDIS®) is a registered trademark of the National Committee for Quality Assurance (NCQA).
4 https://www.ahrq.gov/cahps/news-and-events/events/ahrq-conference-2015/walsh-slides.html
5 Missing values are imputed by substituting the mean adjuster value of reporting unit i.
6 Dillman, D., Smyth, J., & Christian, L. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method (4th ed.). Wiley.
7 This is the range of each CMS survey’s publicly reported response rate. Different CMS surveys calculate and report their official response rates using different AAPOR formulas based on each survey’s fielding methodology.