Supporting Statement, Part B OMB/PRA Submission for Implementation of the Medicare Prescription Drug Plan (PDP) and Medicare Advantage (MA) Plan Disenrollment Reasons Survey
Contract Number: GS-10F-0275P
Task Order Number: 75FCMC20F0033
January 24, 2023
Beth Simon, Contracting Officer Representative
HHS/CMS/CM/MDBG/DCAPP
7500 Security Boulevard, Baltimore, MD 21244 [email protected]
TABLE OF CONTENTS
Collection of Information Employing Statistical Methods
1. Respondent Universe and Sample
2. Information Collection Procedures
3. Methods to Maximize Response Rates
4. Tests of Procedures or Methods
5. Statistical and Questionnaire Design Consultants
Collection of Information Employing Statistical Methods

1. Respondent Universe and Sample

The Medicare Prescription Drug Plan (PDP) and Medicare Advantage (MA) Plan Disenrollment Reasons Survey sample of beneficiaries is designed to be representative of the population of beneficiaries who disenroll voluntarily from their PDP or MA contracts during a calendar year.
The survey design is affected by several issues, which are detailed as follows along with the approach used to address each issue:
Because the disenrollment reasons and care experiences reported by disenrollees may vary over the course of the year, survey respondents should be representative of the distribution across months of disenrollment from each contract. The distribution between December disenrollees (which includes Annual Election Period (AEP) disenrollees who mainly are not dually eligible (DE) for Medicare and Medicaid) and disenrollees from other months must be taken into account.
Approach: The allocation of each contract’s sample to December versus other months is based on distributions predicted from disenrollment experience in the same contract in previous years.
The distributions for subgroups of disenrollees defined by beneficiary and program characteristics representing needs and resources (i.e., Medicare-Medicaid dual eligibility) should be representative of the population of disenrollees. Based on analysis of the disenrollment data and survey data, DE is a key characteristic that requires consideration in sampling, given significant differences in the percentage of duals across different contracts, differential response rates between duals and non-duals, and different reasons cited for disenrollment as compared with non-duals.
Approach: Sampling is stratified by dual-eligibility status within contracts.
Each month’s sample allocation should be calculated as soon as possible after the month’s disenrollment counts become available, so that fielding can proceed with a minimum of recall bias and loss of saliency. Consequently, each month’s allocation must be calculated before disenrollment counts from later months are known.
Approach: Sample allocations by month are calculated based on previous-year distributions for the stratum.
Response and completion rates vary dramatically across sampling strata, reflecting differences by coverage type, month, and dual status.
Approach: Sampling rates incorporate a factor to partially compensate for predicted nonresponse.
While the primary purpose of the survey is to assess reasons for disenrollment by contract, the data are also used for estimates for subgroups of Medicare beneficiaries that cut across contracts; variability in weights due to low sampling rates in large contracts may reduce the efficiency of subgroup estimates.
Approach: A floor sampling rate is established that guarantees some lower bounds on effective sample size for subgroup analyses.
The primary objective of the survey is to compare “reasons” and performance assessment responses across contracts. However, the optimal design for these comparisons is not optimal for other objectives such as estimation of means of these measures for each contract or giving proportional weight to opinions of beneficiary subgroups (e.g., by race/ethnicity, by dual vs. non-dual status).
Approach: CMS uses a compromise design that was adapted to perform reasonably well for multiple objectives.
In short, the sample design strategy for the disenrollment survey is as follows:
First, a target sample size for the calendar year is established. The survey contractor then estimates models from earlier data to project the likely shares of disenrollment in a 2×2 stratification of dual eligibility crossed with December versus all other months of disenrollment, as well as response rates by contract in each of these cells. Sample is allocated to these four cells, and the non-December allocation is then distributed across the remaining 11 months in proportion to historical disenrollment distributions. The sample allocation at each step is a function of the distribution of disenrollments and of variation in the predicted rate of completed responses.
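The following Python sketch illustrates this two-step allocation for a single contract. All inputs (the annual response target, cell shares, response rates, and monthly shares) are hypothetical placeholders, and for simplicity the sketch inflates targets by the full inverse of the predicted response rate, whereas the actual design only partially compensates for nonresponse, as described later.

    # Illustrative sketch of the two-stage allocation: annual target -> four
    # DE-by-period cells -> eleven non-December months. All inputs are hypothetical.
    annual_target_responses = 150  # e.g., a PDP contract's yearly response target

    # Predicted share of disenrollment and predicted response rate per cell
    # (period is "Dec" or "Jan-Nov"; DE indicates dual eligibility).
    cells = {
        ("DE", "Dec"):         {"share": 0.20, "resp_rate": 0.12},
        ("DE", "Jan-Nov"):     {"share": 0.15, "resp_rate": 0.15},
        ("non-DE", "Dec"):     {"share": 0.50, "resp_rate": 0.25},
        ("non-DE", "Jan-Nov"): {"share": 0.15, "resp_rate": 0.22},
    }

    # Allocate target responses in proportion to predicted disenrollment shares,
    # then inflate by the inverse predicted response rate to get mail-out samples
    # (a simplification of the partial-compensation rule described later).
    sample_by_cell = {
        cell: annual_target_responses * v["share"] / v["resp_rate"]
        for cell, v in cells.items()
    }

    # Spread each Jan-Nov cell across the 11 non-December months in proportion
    # to the contract's historical monthly distribution (hypothetical; sums to 1).
    monthly_shares = [0.14, 0.11, 0.11, 0.08, 0.07, 0.08, 0.08, 0.07, 0.08, 0.10, 0.08]
    jan_nov_sample = {
        cell: [n * s for s in monthly_shares]
        for cell, n in sample_by_cell.items() if cell[1] == "Jan-Nov"
    }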
The survey contractor analyzed monthly disenrollment data from 2019-2021 to identify patterns that inform the allocation of the total sample across the 12 months of a survey year. The table below shows the percentage of each year's total disenrollees by month for 2019, 2020, and 2021.
Percentage of Total Disenrollees in Each Year by Month
Month | 2019 | 2020 | 2021
1 | 5.37% | 6.42% | 6.96%
2 | 4.13% | 4.78% | 5.83%
3 | 4.20% | 4.29% | 6.71%
4 | 3.35% | 2.55% | 4.37%
5 | 2.87% | 2.64% | 3.56%
6 | 2.80% | 3.42% | 3.98%
7 | 3.20% | 3.01% | 3.88%
8 | 2.81% | 2.71% | 3.77%
9 | 3.16% | 3.07% | 3.57%
10 | 1.65% | 1.37% | 1.31%
11 | 1.13% | 0.89% | 0.90%
12 | 65.34% | 64.85% | 55.16%
Annual total (N) | 2,290,572 | 3,452,162 | 5,742,734
About half of all disenrollments are from PDP contracts, and the distributions of PDP and MA disenrollments differ only moderately, with a slightly higher percentage of PDP disenrollments occurring in December than in other months. The distribution across months of the year is similar but not identical in 2019, 2020, and 2021. Historically, about 55-65% of each year's disenrollments occur in December, traditionally a heavy month for disenrollment because it coincides with the Annual Election Period (AEP). When monthly proportions are weighted equally by contract (corresponding to approximately equal sample sizes per contract and excluding the "small" contracts), the mean December proportion drops slightly, reflecting the greater concentration of December disenrollments in large-disenrollment (especially PDP) contracts. Much greater variation exists at the level of individual contracts, which suggests that a sampling plan assigning fixed sample proportions to all contracts in each month is likely to be far from optimal.
Sampling targets are expressed in terms of the planned number of responses per contract/period/DE-status cell (where "period" is January-November or December), to facilitate explicit calculation of required sample sizes using predicted response rates that can be revised as new information is obtained. CMS, working with its survey contractor, established target response totals of 75 per year for MA contracts and 150 per year for PDP contracts to achieve adequately reliable contract-level estimates (reliability of 0.70 or higher) of reasons for disenrollment. The MA response target yields contract-level reliability of 0.70 or higher for most of the composite measures for MA contracts (all but "disenrolled due to prescription drug benefits and coverage"). Similarly, the PDP response target yields contract-level reliability of 0.70 or higher for the "disenrolled due to financial reasons" and "disenrolled due to problems getting information and help from the plan" composites, while coming close to 0.70 reliability for the "benefits and coverage" composite for most PDPs.
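As context for these targets, the sketch below shows the usual contract-level reliability calculation, reliability = between-contract variance / (between-contract variance + within-contract variance / n). The variance components in the example are hypothetical placeholders, not estimates from the survey.

    # Contract-level reliability of a composite as a function of responses per contract.
    def contract_reliability(var_between, var_within, n_responses):
        return var_between / (var_between + var_within / n_responses)

    # Responses needed to reach a reliability target (solve the expression above for n).
    def responses_needed(var_between, var_within, target=0.70):
        return target * var_within / ((1.0 - target) * var_between)

    # Hypothetical variance components:
    print(round(contract_reliability(0.01, 0.45, 150), 2))  # ~0.77
    print(round(responses_needed(0.01, 0.45)))              # ~105 responses for 0.70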
Beginning in December 2016, the disenrollee records used in the sample design and sample draw included a flag for DE status. Consequently, by early 2018 the survey contractor had a full year (2017) of historical data available for analysis that included DE status and could apply the findings to stratify the design (and eventually the analysis) by DE. Dual eligibility is strongly associated with the probability of nonresponse, as described in the next paragraphs, so stratification by DE can reduce variance.
The distinction between December disenrollments and those in the other months (January-November) is particularly important for several reasons. December disenrollees reflect the AEP, in which non-dual beneficiaries are free to disenroll between October 15th and December 7th and switch to another contract or to none at all (i.e., switch to traditional Medicare or FFS); dually eligible beneficiaries can also switch contracts during the AEP (all disenrollments between October 15th and December 7th show up in the December disenrollment file), but they make up a smaller proportion of disenrollees (32.4%) compared with non-duals in this month (70.3%). Consequently, around 60% of disenrollment takes place during the AEP, giving the December sample design major influence on total sample sizes for each contract and overall. Furthermore, the proportions of dual eligibles and the DE/non-DE differences in response rates are very different in December than in other months. As the last month of the calendar-year survey reporting period, the December sample draw is also the last opportunity to increase sample size for contracts whose samples for earlier months did not attain targets and to productively absorb sample left over from earlier in the year.
As a compromise, the survey contractor oversamples DE cases relative to non-DE cases in the same contract and month by the square root of the ratio of anticipated response rates. By an extension of the standard Neyman formula[1] for optimal allocation to accommodate unequal costs per unit, this is the sampling rate that would be optimal if the final analysis were weighted by sampling stratum. With this allocation, the expected weight ratio is also the same square root. This procedure shifts sample toward the low-response-rate strata, but not as much as would be needed for the distribution of respondents to match the population distribution exactly. A similar procedure is followed for December versus non-December disenrollments.
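A minimal sketch of this square-root allocation rule follows, using hypothetical response rates (only their ratio matters):

    import math

    resp_rate_non_de = 0.25  # anticipated response rate, non-DE stratum (hypothetical)
    resp_rate_de = 0.12      # anticipated response rate, DE stratum (hypothetical)

    # DE cases are oversampled relative to non-DE cases in the same contract and
    # month by the square root of the ratio of anticipated response rates.
    oversample_factor = math.sqrt(resp_rate_non_de / resp_rate_de)  # ~1.44

    # If non-DE cases are sampled at rate f, DE cases are sampled at f * oversample_factor.
    # The expected DE/non-DE weight ratio among respondents is then also ~1.44; fully
    # matching the population distribution would instead require an oversampling factor
    # of resp_rate_non_de / resp_rate_de (~2.08), shifting even more sample to the
    # low-response-rate stratum.
    print(round(oversample_factor, 2))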
Response rates to the disenrollment reasons survey historically have varied substantially across contracts, even after controlling for DE status and month of disenrollment, leaving some contracts with achieved samples below targets. To reduce the resulting variation in responses per contract and in weights, after target sample sizes are calculated for a month the survey contractor adjusts them for differential response rates by contract. This is possible because these rates are well correlated from one year to the next (r > 0.85 in some studies, although without control for DE fraction). Each provisional sample target is multiplied by RM/R, where RM is the mean of contract response rates and R is the particular contract's projected response rate. The survey contractor projected these rates with a multivariate-outcome multilevel logistic model, using actual 2020 response rates to predict 2021 response rates; this model implements "shrinkage" of noisy observed rates toward the mean. The survey contractor also constrained predicted response rates to lie between 60% and 166% of the average response rate and assumed R = RM for new contracts with no previous response data. This helps to equalize the number of responses, and hence reliability, across contracts, while placing some constraints on excessive allocation of sample to contracts with low response rates.
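The sketch below illustrates the RM/R adjustment and its constraints with hypothetical contract identifiers and projected rates; in practice the projections come from the multilevel model described above rather than from raw observed rates.

    # Nonresponse adjustment of provisional sample targets: multiply by RM / R, with
    # R constrained to [0.60 * RM, 1.66 * RM] and R = RM assumed for new contracts.
    projected_rates = {"H0001": 0.24, "H0002": 0.09, "S0003": 0.31, "H0004": None}  # None = new contract
    known = [r for r in projected_rates.values() if r is not None]
    rate_mean = sum(known) / len(known)  # RM

    def adjusted_target(provisional_target, projected_rate):
        r = rate_mean if projected_rate is None else projected_rate
        r = min(max(r, 0.60 * rate_mean), 1.66 * rate_mean)
        return provisional_target * rate_mean / r

    for contract, r in projected_rates.items():
        # Low-response contracts receive larger mail-out samples, up to the cap.
        print(contract, round(adjusted_target(100, r)))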
In 2021, the survey contractor allocated each contract's sample to months by multiplying the intended target for the contract by the fraction of overall 2020 disenrollment that occurred in that month. When the number of disenrollments for a contract-month was less than this allocation, the unfilled portion of the sample allocation was either filled from unused cases in the same month with the opposite dual-eligibility status (dual-eligible disenrollees substituted for non-duals, or the reverse) or carried forward to the next month.
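A minimal sketch of this fill-or-carry-forward rule, with hypothetical allocation and availability counts:

    # If a contract-month stratum has fewer eligible disenrollees than its allocation,
    # fill the shortfall from unused cases in the opposite DE stratum for the same
    # month; anything still unfilled carries forward to the next month's allocation.
    def allocate_month(alloc_de, alloc_non_de, avail_de, avail_non_de):
        draw_de = min(alloc_de, avail_de)
        draw_non_de = min(alloc_non_de, avail_non_de)
        shortfall = (alloc_de - draw_de) + (alloc_non_de - draw_non_de)

        # Unused cases exist only in strata that met their own allocation.
        spare_de = avail_de - draw_de
        spare_non_de = avail_non_de - draw_non_de
        substitute_de = min(shortfall, spare_de)
        substitute_non_de = min(shortfall - substitute_de, spare_non_de)
        draw_de += substitute_de
        draw_non_de += substitute_non_de

        carry_forward = shortfall - substitute_de - substitute_non_de
        return draw_de, draw_non_de, carry_forward

    # Example: a DE allocation of 40 with only 25 eligible cases; 10 of the 15-case
    # shortfall is covered by spare non-DE cases and 5 carries forward.
    print(allocate_month(40, 60, 25, 70))  # -> (25, 70, 5)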
Contracts that terminate or consolidate into another contract are dropped from the sampling frame when this information becomes available during the course of the year (typically mid-year); the survey team conducts monthly checks of the CMS plan information file to identify contract terminations and consolidations.
For each contract, the initial target sample for each month by DE status is calculated as described above. As sampling unfolds month by month, the survey contractor applies the following additional constraints: (1) the sample cannot be larger than the number of disenrollees eligible for sampling in the month; (2) floor sampling rates are applied to guarantee that sampling rates are not too low in contracts with large numbers of disenrollees, to maintain the accuracy of national overall and subgroup estimates; (3) when a contract fails to meet its target for a month because it has too few voluntary disenrollees, the unmet sample allocation is transferred to the next month; and (4) fractional sample allocations are rounded without bias to integral sample sizes, as illustrated below. In December, the survey contractor assesses whether the unused part of the total sample allocation for each contract is adequate for the intended December sample size and supplements the sample for contracts where the shortfall threatens the accuracy of the entire sample.
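To illustrate constraint (4), a minimal sketch of unbiased (stochastic) rounding, in which the expected value of the rounded allocation equals the fractional target; the function name is illustrative only.

    import math
    import random

    def round_unbiased(x, rng=random):
        # Round down, then round up with probability equal to the fractional part,
        # so that the expected value of the result equals x.
        lower = math.floor(x)
        return lower + (1 if rng.random() < (x - lower) else 0)

    # Example: a fractional allocation of 12.3 becomes 13 about 30% of the time
    # and 12 otherwise, so the expected sample size is exactly 12.3.
    print(round_unbiased(12.3))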
The basic sampling design calls for roughly equal numbers of completed responses per contract (excluding those with very small numbers of disenrollees), which implies that the sampling rate in a very large contract is lower than in a smaller one. Such disproportionate sampling implies inefficiency in estimates for domains that cut across contracts, such as national breakdowns by race/ethnicity, contract type, etc.; that is, estimates are less precise than they would have been if a sample of the same size had been drawn with equal probabilities by sampling at the same rate from every contract. This inefficiency can be controlled by adding sample to bring the lowest sampling rates up to a higher floor.
In 2021, the survey contractor set an average floor sampling rate of 0.02% in PDP contracts and 0.05% in MA-PD contracts.
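The sketch below shows how a floor sampling rate might be applied and, via the Kish approximation of effective sample size, why highly unequal weights reduce the precision of estimates pooled across contracts. The contract sizes, targets, and weights are hypothetical; only the 0.02% floor value is taken from the text above.

    # Raise a contract's sample so that its sampling rate is at least the floor.
    def apply_floor(target_sample, n_disenrollees, floor_rate):
        return max(target_sample, floor_rate * n_disenrollees)

    # Kish approximation: effective sample size under unequal weights w_i
    # is (sum of weights)^2 / (sum of squared weights).
    def effective_sample_size(weights):
        return sum(weights) ** 2 / sum(w * w for w in weights)

    # Example: the 0.02% PDP floor raises a hypothetical large contract's draw.
    print(apply_floor(300, 2_000_000, 0.0002))  # 300 -> 400

    # Toy comparison: equal weights vs. half the cases weighted five times as heavily.
    print(round(effective_sample_size([1.0] * 200)))                # 200
    print(round(effective_sample_size([5.0] * 100 + [1.0] * 100)))  # ~138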
2. Information Collection Procedures

Sample members from each of twelve monthly disenrollment periods (one disenrollment period per month) receive:
A pre-notification letter (attachment I) indicating that the member will receive a survey;
A mailed survey with a cover letter explaining the purpose of the survey (attachment II) and a postage-paid return envelope; and
A second mailed survey for nonrespondents, with a follow-up cover letter (attachment III) and a postage-paid return envelope.
Members in Puerto Rico receive Spanish translations of all letters and surveys. Beneficiaries living on the mainland can request a Spanish-language version of the survey.
The survey contractor mails the first-wave survey approximately three days after sending the pre-notification letter, and then mails the second survey to any nonrespondents four weeks after mailing the first-wave survey. The average survey fielding time for each monthly disenrollment period is approximately two months, and fielding occurs as close in time to the point of disenrollment as possible to maximize response rates and reduce recall bias.
3. Methods to Maximize Response Rates

An annual response rate of approximately 20.3% is expected, based on recent experience with the disenrollment survey. The survey contractor will employ two mail survey waves to minimize nonresponse and will consider other methods (such as fielding written surveys in both English and Spanish and following up with nonrespondents by phone) if necessary and if project budgets allow.
4. Tests of Procedures or Methods

No tests of procedures or methods will be undertaken as part of this data collection.
5. Statistical and Questionnaire Design Consultants

The survey, sampling approach, and data collection procedures were designed by the RAND Corporation under the leadership of:
Cheryl Damberg, Ph.D.
Principal Senior Researcher
RAND Corporation
1776 Main Street
PO Box 2138
Santa Monica, CA 90401-2138
Data will be collected by the survey vendor CSS Research under the direction of –
Jeff Burkeen
CSS Research
1625 K Street NW, 8th Floor
Washington, DC 20006
[1] Khan, M.G.M., & Wesolowski, J. (2018). Neyman-type sample allocation for domains-efficient estimation in multistage sampling. AStA Advances in Statistical Analysis. https://doi.org/10.1007/s10182-018-00340-2