Contact Information:
William S. Long
Contracting Officer’s Representative, Medicare Current Beneficiary Survey Office of Enterprise Data and Analytics (OEDA)/CMS
7500 Security Boulevard, Mail Stop Mailstop B2-04-12
Baltimore, MD 21244
(410) 786-7927
[email protected]
(410) 786-5515 (fax)
March 22, 2024
B1. Universe and Respondent Selection
B2. Procedures for Collecting Information
B3. Methods for Maximizing Response Rates and Dealing with Issues of Non-Response
B4. Tests of Procedures or Methods
B5. Individuals Consulted on Statistical Aspects of Design
Attachment 1: Community Materials
Attachment 2: Community Instrument (Baseline and Continuing) and Showcards
Attachment 3: Facility Eligibility Screener
Attachment 4: Facility Instrument (Baseline and Continuing) and Showcards
Attachment 5: Facility Materials
Attachment 6: CAPI Screenshots of Introductory Screen and Thank You Screen
Attachment 7: 2023 Content Cycle Cognitive Testing Report
Attachment 8: 2024 Content Cycle Cognitive Testing Report
Attachment 9: New MCBS Respondent Materials
The revision to this OMB package includes the following modifications to the Community and Facility instrument sections:
Add one new item to the Income and Assets Questionnaire (IAQ) about participation in the Supplemental Nutrition Assistance Program (SNAP).
Add twelve new items to the Health Status and Functioning Questionnaire (HFQ):
Five items about the prevalence of bowel incontinence.
Five items about oral health-related quality of life.
Two items about insulin administration.
Add two new items to the Health Insurance Questionnaire (HIQ):
Replace one existing item about Veterans Affairs (VA) health care utilization.
Add one new item about VA health care enrollment.
Streamline COVID-19 items in the Community questionnaire and Facility instrument to remove items that are no longer relevant at this phase of the pandemic:
Delete 20 COVID-19 items from the Community questionnaire and change the administration schedule for the remaining COVID-19 items from three times per year to annual.
Delete 35 COVID-19 items from the Facility instrument and change the administration schedule for the remaining COVID-19 items from three times per year to annual.
Include updated respondent materials to increase understanding of the survey and improve participation.
The target universe is current Medicare beneficiaries entitled to hospital and/or supplementary medical insurance and living in the 50 states or the District of Columbia. Both institutionalized and non-institutionalized beneficiaries are represented. Table B.1 summarizes the number of beneficiaries in the target universe based on CMS administrative records through 2022. The seven age groups shown in the table correspond to the primary sampling strata from which the samples for the MCBS are drawn. The age groups are defined by the beneficiaries’ age as of December 31 of the given year for 2017 and later.
Table B.1: Universe Counts Broken Down by MCBS Age Groups (in thousands)
Age Interval | 2017 | 2018 | 2019 | 2020 | 2021 | 2022
Disabled <45 | 1,842.08 | 1,791.78 | 1,771.52 | 1,744.56 | 1,715.78 | 1,646.76
45-64 | 7,076.64 | 6,903.46 | 6,773.12 | 6,641.56 | 6,411.54 | 6,153.78
65-69 | 15,767.28 | 15,978.62 | 16,368.74 | 16,895.90 | 16,975.40 | 17,149.42
70-74 | 13,080.94 | 13,647.66 | 14,322.88 | 14,967.58 | 15,115.86 | 15,278.12
75-79 | 9,080.94 | 9,463.14 | 9,820.30 | 10,117.54 | 10,576.94 | 11,296.14
80-84 | 6,137.60 | 6,301.04 | 6,441.96 | 6,610.14 | 6,737.94 | 7,098.58
85+ | 7,021.14 | 7,001.80 | 7,052.58 | 7,099.28 | 6,902.06 | 6,966.30
Total (64 and under) | 8,918.72 | 8,695.24 | 8,544.64 | 8,386.12 | 8,127.32 | 7,800.54
Total (65 and over) | 51,087.90 | 52,392.26 | 54,006.46 | 55,690.44 | 56,308.20 | 57,788.56
Total (All) | 60,006.62 | 61,087.50 | 62,551.10 | 64,076.56 | 64,435.52 | 65,589.10
Source: Universe counts are based on a 5-percent extract of the Medicare administrative records and are computed as 20 times the extract counts.
Notes: Puerto Rico beneficiaries are excluded from counts beginning in 2017 by sample design.
Totals do not necessarily equal the sum of rounded components.
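For illustration, the scaling described in the source note can be expressed as a short calculation. The sketch below (Python) uses hypothetical extract counts, back-calculated from the 2022 column only to show the arithmetic; it is not the actual extract.

```python
# Illustrative only: scale counts from a hypothetical 5-percent extract up to
# universe estimates, as described in the source note for Table B.1.
extract_counts = {"Disabled <45": 82_338, "45-64": 307_689}  # hypothetical extract counts

universe_estimates = {stratum: count * 20 for stratum, count in extract_counts.items()}

for stratum, estimate in universe_estimates.items():
    # Report in thousands, matching the presentation in Table B.1.
    print(f"{stratum}: {estimate / 1000:,.2f} thousand beneficiaries")
```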
The target sample size of the MCBS varies slightly each year. From 2023 onward, it has been designed to yield 9,691 completed cases providing Cost Supplement data per year (approximately 800-900 disabled enrollees under age 65 in each of two age strata, and 1,400-1,700 enrollees in each of five age strata for enrollees age 65 and over).
To achieve the desired number of completed cases, the MCBS selects new sample beneficiaries each year (referred to as the Incoming Panel) to compensate for nonresponse, attrition, and retirement of sampled beneficiaries in the oldest panel (referred to as the exit panel) and to include the current-year enrollees, while continuing to interview the non-retired portion of the continuing sample. The Incoming Panel is always added in the Fall round (also referred to as the Baseline interview); the exit panel completes its 11th and final interview in the Winter round.
Each year, an analysis of non-response and attrition is conducted to determine the optimal sample size for the Fall round Incoming Panel. Through 2009, approximately 6,500 beneficiaries were added to the sample in the Fall (September-December) round each year to replace the exiting panel and to offset sample losses due to non-response and attrition. Beginning in the Fall round of 2010 and continuing through the decade, the number of beneficiaries included in the Incoming Panel sample release was gradually increased to compensate for declining response rates. Beginning in 2020, when interviewing shifted from in-person to telephone due to the COVID-19 pandemic, the Incoming Panel sample size was approximately 15,500. This increase reflects the continued decline in response rates and the additional difficulty of locating respondents by telephone[1]. The sample size results in over 36,000 interviews completed per year.
Proxy interviews are attempted for deceased sample persons. If data are collected through the date of death, then these cases are counted as completed interviews. Sampled beneficiaries remain in the survey when they are unavailable for an interview in a given round; that is, they are carried forward into the next round. For these individuals, the reference period for their next interview is longer as it covers the period since their last interview. This ensures that there will not be a gap in coverage of utilization and expenditure data. If a sampled beneficiary is not interviewed for two consecutive rounds, they are not scheduled for any further interviews and are removed from case management. Such cases are treated as nonresponding cases.
The methodology for drawing the samples is described later in this document. The number of cases selected each year for the Incoming Panel (the designated sample size) is larger than the targeted number of completes to compensate for non-response, ineligibility, and attrition. In addition, beginning in 2020, more sample has been required to compensate for the switch from in-person interviewing to telephone interviewing and the lower response rates expected with that mode. Table B.2 illustrates the extent of the compensation that was necessary in Fall 2021 (Round 91) to achieve the desired number of cases providing annual data; a simplified illustration of this calculation follows the table.
Table B.2: Sample Size Needed to Compensate for Initial Non-Response and Ineligibility in the 2021 Fall Round
Age on December 31 of reference year | Desired average number of cases providing annual data | Number sampled at Fall 2021 Round 91
18-44 | 343 | 1,258
45-64 | 332 | 1,602
65-69 | 687 | 3,082
70-74 | 600 | 2,314
75-79 | 603 | 2,402
80-84 | 620 | 2,651
85+ | 648 | 2,641
Total | 3,833 | 15,950
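As a rough illustration of how a designated sample size relates to a target number of completes, the sketch below (Python) divides the target by assumed response, eligibility, and retention rates. The rates shown are hypothetical and are not the planning assumptions actually used for the MCBS.

```python
import math

# Hypothetical planning assumptions for one age stratum; actual MCBS planning
# rates differ and vary by stratum and year.
target_completes = 648            # desired cases providing annual data
expected_response_rate = 0.42     # assumed Baseline (first-round) response rate
expected_eligibility_rate = 0.97  # assumed share of sampled records still eligible
expected_retention_rate = 0.60    # assumed retention through the reference year

designated_sample_size = math.ceil(
    target_completes
    / (expected_response_rate * expected_eligibility_rate * expected_retention_rate)
)
print(designated_sample_size)  # number of beneficiaries to sample in this stratum
```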
Cross-sectional sample sizes for other domains. There are multiple domains of interest in the MCBS (for example, respondents with end-stage renal disease, persons residing in nursing homes, managed care enrollees, beneficiaries of various racial and ethnic backgrounds, Medicaid recipients, and beneficiaries aligned to a provider participating in accountable care organizations). The MCBS will continue to maintain a minimum target of 9,000 completed responses in the annual Cost Supplement file to ensure that analysis can be performed on MCBS data for many domains of interest.
Sample sizes for longitudinal analyses. Beginning in 2018, under the rotating panel design specified for the MCBS, respondents remain in the sample for up to eleven rounds of data collection over a four-year period; prior to 2018, respondents remained in the sample for up to twelve rounds of data collection. The historical response rates and attrition rates observed in the MCBS are used to determine the rotational sample size and configuration of each new Incoming Panel. The rotational sample design attempts to achieve consistency in subgroup sample sizes across all panels comprising a particular calendar year.
Table B.3 (in section B2 below) presents the round-by-round conditional and unconditional response rates as of Round 88 (the Fall round of 2020) for the samples (referred to in the table as “panels”) selected in 2014 through 2020. For example, from the bottom part of the table, it can be seen that by the 10th round of data collection for the 2017 panel, 20.7 percent of that panel was still in a formal responding status (that is, either the sampled beneficiary was alive and still participating in the study, or the beneficiary had died but a cooperative proxy was found for the collection of data on the last months of life). For the 2018 and 2019 panels, the unconditional response rates as of Round 88 were 24.5 percent (through the 7th round of data collection) and 32.1 percent (through the 4th round of data collection), respectively. The 2020 panel (the new panel selected in Round 88) had an initial response rate of 41.9 percent in its first round of data collection.
Round 88 (Fall 2020) is the latest round for which MCBS data have been fully processed. There were 2,269 interviews successfully completed at Round 88 with still-living members of the 2017 panel. For brevity, we refer to these 2,269 interviews as “live completes.” For the 2018 and 2019 panels there were 2,645 and 3,521 live Round 88 completes, respectively. For the first round of data collection for the 2020 panel, there were 6,379 completes at Round 88.
The MCBS has used a variety of techniques to maintain respondents in the survey and reduce attrition. These will be continued and adapted to comply with the time frames for initiating and implementing the continuing sample.
This section describes the procedures used to select the samples for the national survey. It includes a general discussion of the statistical methodology for stratification and rotational panel selection, estimation procedures, and the degree of accuracy needed. This is followed by a presentation of how instrument sections are used to enhance the analytic potential of the MCBS data. Finally, there is a discussion of rules for allowing proxy response.
This section opens with a description of the MCBS sample design. This is followed by a general discussion of the selection of the original and annual new incoming samples and the use of Medicare administrative enrollment data each year to reduce problems associated with duplication of samples across the years.
PSU and Census tract clustering. The MCBS employs a complex multistage probability sample design. At the first stage of selection, the sample consists of 104 primary sampling units (PSUs)[2] defined as metropolitan areas and clusters of nonmetropolitan counties. At the second stage of selection, samples of Census tracts are selected within the sampled PSUs. At the third and final stage of selection, stratified samples of beneficiaries within the selected Census tracts are drawn at rates that depend on age group and ethnicity.
The strata used for selection of the PSUs cover the 50 states and the District of Columbia. Because PSUs were selected randomly with probabilities proportionate to size, some states have no sample PSUs within their boundaries. Within major strata defined by region and metropolitan status, PSUs were sorted by the percent of beneficiaries enrolled in HMOs and/or the percent of beneficiaries who are minorities, based on data in CMS administrative files. Substrata of roughly equal size were created from the ordered list for sample selection.
In 2014, within the PSUs, a sample of 703 second-stage units (SSUs) consisting of Census tracts or clusters of adjacent tracts was selected. There were several steps in the SSU sampling process. First, an extract of the entire Medicare administrative enrollment data was obtained, and all beneficiaries’ addresses were geocoded to the tract level. A minimum measure of size was used to determine whether a Census tract was large enough (i.e., had enough Medicare beneficiaries) to stand on its own as an SSU or would need to be combined with one or more adjacent tracts. A frame of 24,212 SSUs was then constructed, and a sample of 703 SSUs was selected using systematic probability-proportional-to-size (PPS) sampling. These SSUs have been used for sampling MCBS beneficiaries since 2014[3] and were sized to be used for up to 20 years. An additional sample of 339 reserve SSUs was also selected to support an expansion of the sample or the study of special rare populations in future years. To date, these reserve SSUs have not been used for MCBS sampling.
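The systematic PPS selection described above can be sketched as follows; the frame sizes, sample size, and random seed are hypothetical, and the production selection applies stratification and frame sorting not shown here.

```python
import random

random.seed(12345)  # for reproducibility of this illustration

def systematic_pps(measures_of_size, n_sample):
    """Systematic selection with probability proportional to size (PPS)."""
    total = sum(measures_of_size)
    interval = total / n_sample
    start = random.uniform(0, interval)
    selection_points = [start + k * interval for k in range(n_sample)]

    selected, cumulative, point = [], 0.0, 0
    for unit_id, size in enumerate(measures_of_size):
        cumulative += size
        # A unit is selected each time a selection point falls within its
        # cumulative size range; very large units can be hit more than once.
        while point < n_sample and selection_points[point] <= cumulative:
            selected.append(unit_id)
            point += 1
    return selected

# Hypothetical frame: 1,000 SSUs with varying Medicare beneficiary counts.
frame = [random.randint(400, 5_000) for _ in range(1_000)]
print(systematic_pps(frame, n_sample=30))
```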
Table B.3: Conditional and Unconditional Response Rates as of the 2020 Panel for Medicare Current Beneficiary Survey by Interview Round
Conditional Response Rates (%) for Medicare Current Beneficiary Survey by Interview Round
Round | 2014 Panel | 2015 Panel | 2016 Panel | 2017 Panel | 2018 Panel | 2019 Panel | 2020 Panel
Round 1 | 58.7 | 53.3 | 54.7 | 55.3 | 55.9 | 55.1 | 41.9
Round 2 | *** | 83.2 | 81.4 | 79.9 | 80.9 | 73.4
Round 3 | 82.1 | 82.7 | 83.9 | 83.1 | 82.2 | 83.5
Round 4 | 84.1 | 80.0 | 84.2 | 85.1 | 84.7 | 83.9
Round 5 | 85.9 | 88.3 | 87.9 | 88.1 | 74.9
Round 6 | 81.1 | 88.0 | 87.7 | 85.7 | 89.3
Round 7 | 83.4 | 87.7 | 88.1 | 89.4 | 88.9
Round 8 | 91.1 | 91.5 | 90.9 | 80.3
Round 9 | 89.7 | 92.0 | 89.2 | 92.7
Round 10 | 90.3 | 91.9 | 93.2 | 91.4
Round 11 | 96.2 | 96.8 | 91.4
Unconditional Response Rates (%) for Medicare Current Beneficiary Survey by Interview Round
Round | 2014 Panel | 2015 Panel | 2016 Panel | 2017 Panel | 2018 Panel | 2019 Panel | 2020 Panel
Round 1 | 58.7 | 53.3 | 54.7 | 55.3 | 55.9 | 55.1 | 41.9
Round 2 | *** | 44.2 | 44.3 | 43.7 | 44.8 | 40.2
Round 3 | 48.1 | 31.7 | 38.1 | 37.7 | 37.6 | 37.9
Round 4 | 40.1 | 32.9 | 33.3 | 33.7 | 34.3 | 32.1
Round 5 | 35.8 | 31.3 | 29.0 | 28.2 | 26.7
Round 6 | 21.9 | 28.1 | 27.5 | 27.3 | 27.6
Round 7 | 28.4 | 25.6 | 25.5 | 26.2 | 24.5
Round 8 | 27.1 | 23.0 | 21.9 | 21.6
Round 9 | 24.6 | 22.7 | 22.1 | 22.7
Round 10 | 23.2 | 21.7 | 21.8 | 20.7
Round 11 | 23.0 | 21.7 | 20.4
* The 2014 panel response rate was impacted by several operational design changes arising from the transition between contractors in 2014, including an extensive CAPI instrument development effort originally considered out of scope for transition purposes, the need to release a larger 2014 Incoming Panel sample to account for a smaller continuing sample fielded in the fall of 2014, the hiring and training of 100 new interviewers for MCBS data collection, and the decision to extend the Incoming Panel sample through the release of additional replicates in December 2014, which left a shorter data collection period, and consequently a lower response rate, for those 2,500 sample members.
*** Not available because the 2015 Winter and Summer rounds (R71 and R72) were combined for data collection in this year only. Again, this was due to transition activities that started in 2014 and were completed in 2015.
‡ In rounds where some cases are intentionally not fielded, unconditional response rates will be lower than they would have been if all eligible cases had been fielded. For example, in Summer 2016 (Round 75), some cases were intentionally not fielded and were instead included in an early case release for Fall 2016 (Round 76); the resulting unconditional response rates for the 2013-2015 panels in their 9th, 6th, and 3rd rounds, respectively, were lower than they would otherwise have been, but increased again in the subsequent rounds. In Winter 2018 (Round 80), a group of 306 cases was intentionally not fielded as part of a strategic NIR experiment, affecting the 2015 and 2016 panels in their 8th and 5th rounds, respectively. In Winter 2019 (Round 83), a group of 600 cases was intentionally not fielded as part of a strategic NIR experiment, affecting the 2016 and 2017 panels in their 8th and 5th rounds, respectively.
Selection of beneficiaries. As described earlier, an annual Incoming Panel sample of beneficiaries is selected from the Medicare administrative enrollment data[4]. This sample is clustered within the selected PSUs and SSUs and is designed to achieve uniform sampling weights within each stratum. Beginning in 2015, beneficiaries who become eligible at any time during the sampling year are also included in the Medicare administrative enrollment sampling frame (referred to as current-year enrollees). Also beginning in 2015, Hispanic beneficiaries living outside of Puerto Rico were oversampled. Nursing home residents are drawn into the sample in exactly the same manner as other beneficiaries residing in the community.
To date, sampling weights have been calculated for each Fall round (1, 4, 7…, and 88) in order to produce the Survey File limited data sets (previously referred to as the Access to Care files), and for each calendar year in order to produce the Cost Supplement limited data sets (previously referred to as the Cost and Use files). In both cases, cross-sectional and longitudinal weights have been calculated. Some questionnaire sections fielded in the Winter or Summer rounds have specific cross-sectional weights calculated for them as well. In all cases, weights reflect differential probabilities of selection and differential nonresponse, and are adjusted to account for overlapping coverage of the panels included in the data files. Replicate weights were also calculated so that users can calculate standard errors using replication methods. In addition to the replicate weights, stratum and unit codes exist on each weight file for users who prefer to use Taylor Series methods to estimate variances.
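To make the weighting concepts concrete, the sketch below (Python/NumPy) computes a weighted mean using a full-sample weight and approximates its standard error from a set of replicate weights. The data, weight values, number of replicates, and the simple squared-deviation combination are all illustrative assumptions; the correct replication constant depends on the replication method used for a given MCBS file.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_reps = 500, 80                               # hypothetical sample size and replicate count
y = rng.gamma(shape=2.0, scale=3_000.0, size=n)   # e.g., annual out-of-pocket spending
full_weight = rng.uniform(1_000, 9_000, size=n)   # hypothetical full-sample weights
replicate_weights = full_weight[:, None] * rng.uniform(0.7, 1.3, size=(n, n_reps))

# Point estimate from the full-sample weight.
point_estimate = np.average(y, weights=full_weight)

# Re-estimate under each replicate weight and combine the squared deviations.
replicate_estimates = np.array(
    [np.average(y, weights=replicate_weights[:, r]) for r in range(n_reps)]
)
variance = np.mean((replicate_estimates - point_estimate) ** 2)  # generic combination
print(point_estimate, np.sqrt(variance))
```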
Besides standard weighting and replicate weighting, another part of the estimation program is the full imputation of the data sets to compensate for item non-response. Imputation procedures for charges for non-covered services and for sources of payment for covered services in the Cost Supplement files have been developed. Beginning with the 2015 data, unit-level imputation was also instituted to compensate for missing initial-round utilization and cost data[5] for current-year enrollees. The weighting and imputation of data continue each year.
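The production imputation system is considerably more elaborate, but the basic idea of filling an item non-response from a similar responding case can be illustrated with a simple within-class hot deck; the classing variable, records, and values below are hypothetical.

```python
import random

random.seed(7)

# Hypothetical utilization records; None marks an item non-response on charge.
records = [
    {"age_group": "65-69", "charge": 120.0},
    {"age_group": "65-69", "charge": 95.0},
    {"age_group": "65-69", "charge": None},
    {"age_group": "85+", "charge": 410.0},
    {"age_group": "85+", "charge": None},
]

# Pool donor values within each imputation class (here, the age group).
donors = {}
for rec in records:
    if rec["charge"] is not None:
        donors.setdefault(rec["age_group"], []).append(rec["charge"])

# Fill each missing charge with a randomly chosen donor from the same class.
for rec in records:
    if rec["charge"] is None and donors.get(rec["age_group"]):
        rec["charge"] = random.choice(donors[rec["age_group"]])

print(records)
```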
A broad range of statistics are produced from the MCBS. There is no single attribute of beneficiaries and their medical expenses that stands out as the primary goal of the survey. Thus, there can be no simple criterion for the degree of reliability that statistics for each analytic domain should satisfy. Even with a larger sample size of 14,000 to 15,000 persons, there would be many small domains of interest for which it would be necessary to use modeling techniques or to wait several years for sufficient data to accumulate.
The MCBS will maintain a stratified approach to the selection of the sample. The sample will continue to be clustered by PSU and Census tract-based SSU and stratified by age domain and race/ethnicity; the tract-based SSU approach, an innovation introduced in 2014, has resulted in greater efficiencies and increased analytic opportunities. We anticipate maintaining a total of 700-900 annual cases allocated to the two younger age categories for disabled beneficiaries who are not yet 65. The two age categories were selected because they indirectly reflect the means by which the disabled person becomes eligible for Medicare. Since the number of disabled sample persons per PSU and Census tract will be small, the effects of clustering on statistical precision should be mild for this subgroup. For example, depending on the prevalence of the characteristic being estimated, the MCBS has achieved standard errors of 2-3 percentage points or lower for subgroup estimates of percentages based on 1,000 respondents.
Since many of the cost and reimbursement statistics derived from the MCBS may be heavily right-skewed (i.e., reflecting the higher end of the cost/reimbursement spectrum to a disproportionate degree), the accuracy may be lower in relative terms but still acceptable. For example, the relative standard error of the mean total Medicare reimbursements derived from the MCBS has generally ranged from 2.0-2.5% for the total sample, and 4.0-8.0% for subgroups.
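For reference, the relative standard error cited above is simply the standard error expressed as a percentage of the estimate itself; the figures in the sketch below are hypothetical.

```python
# Hypothetical values chosen only to show the calculation.
mean_reimbursement = 12_500.0   # estimated mean total Medicare reimbursement ($)
standard_error = 280.0          # estimated standard error of that mean ($)

relative_standard_error = 100 * standard_error / mean_reimbursement
print(f"{relative_standard_error:.1f}%")  # 2.2%, within the 2.0-2.5% range noted above
```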
Each of the age strata for the Medicare sample age 65 and over will be allocated 1,600-2,200 cases, with the oldest stratum (age 85 and over) being allocated about 1,900 cases with oversampling. A major reason for oversampling the very old is to obtain an adequate sample of nursing home stays. Variations in sampling weights across the age strata and clustering within PSU and Census tract will inflate sampling errors, but the resulting effective sample sizes should be adequate for most analyses.
Content and timing of instrument sections.
The primary variables of interest for the MCBS are the use and cost of health care services and associated sources and amounts of payment. While Medicare claims files supply information on billed amounts and Medicare payments for covered services, the survey provides important self-reported information on use of services not covered by Medicare and on payment sources and amounts for costs not reimbursed by Medicare. For both the Community and Facility components, the primary focus of the data collection is on use of services (dental, hearing and vision care, hospital, physician, medical providers, prescription medication and other medical services), sources and amounts of payment, and health insurance coverage. The MCBS interview collects continuous information on these items through thrice-yearly interviews; that is, once a new respondent completes their Baseline interview, they are asked utilization and cost questions each round.
Continuous data on utilization and expenditures are required for a number of reasons. First, several of the distinct expenditure categories involve relatively rare medical events (inpatient hospital stays, use of home health care, purchase of durable medical equipment, and so forth), so limiting the reference period would mean insufficient observations for annual estimates. Second, episodes of medical care often consist of a series of services over weeks or months; data collected several times a year allow examination of the grouping of services and costs around particular episodes of care. Third, payment for medical services often occurs considerably later than the utilization, so collection of complete information about a particular event can often only be obtained sometime after the event occurs.
The administration of the instruments will continue to follow the established pattern of data collection. Baseline interviews will be conducted in the initial interview with new Incoming Panel respondents. These will be followed by 10 interviews that collect utilization, cost, and other important topics, referred to as Continuing interviews. Since the Baseline interview always occurs in the last four months of a calendar year, collection of utilization and expenditure data in the second interview means the reference period will always begin prior to January 1st. This allows use and expenditure estimates to be produced on a calendar-year basis.
The literature (initially reported by Neter and Waksberg in 1964[6] and confirmed in subsequent research by other analysts) indicates that collection of behavioral information over an unbounded recall period can result in large recall errors. The Incoming Panel interviews covered in this clearance request - Fall 2023 (Round 97), Fall 2024 (Round 100), and Fall 2025 (Round 103) - prepare the respondent for the collection of utilization and expenditure information in subsequent rounds, thus “bounding” the recall period for the next interview. During the Baseline interview, the respondent is provided with a calendar, and interviewers emphasize the importance of this tool for use in future interviews. This calendar marks the recall period for the respondent and serves as a means to record utilization as well as a prompt to retain statements and bills.
Content of the instruments, Rounds 98-106.
Nearly all of the instrument sections as currently approved by OMB are unchanged. Table B.4 presents the core and topical sections that comprise the MCBS Community instrument. As shown in the table, the content and order of administration varies based on season of data collection (Fall, Winter, Summer) and the type of interview (Baseline, Continuing). Those sections with an asterisk (*) include a revision contained in this clearance request (either adding or deleting questions). Occasionally an item may be moved from one questionnaire section to another to improve the flow and use of the data, or for other operational or analytic purposes.
Table B.4: Community Instrument Sections and Order of Administration
Section (listed in the order in which the section is administered) | Type of Section (Core or Topical) | Season of Administration (Rounds Administered) | Interview Type (Baseline, Continuing, Both)
Introduction (INQ) | Core | All (Rounds 98-106) | Both
Enumeration (ENS) | Core | All (Rounds 98-106) | Both
Housing Characteristics (HAQ) | Topical | Fall (Rounds 100, 103, 106) | Both
Health Insurance (HIQ)* | Core | All (Rounds 98-106) | Both
Mobility of Beneficiaries (MBQ) | Topical | Fall (Rounds 100, 103, 106) | Both
Preventive Care (PVQ) | Topical | All (Rounds 98-106) | Both
COVID-19 (CVQ)* | Topical | All (Rounds 98-106) | Both
Health Status and Functioning (HFQ)* | Core | Fall (Rounds 100, 103, 106) | Both
Nicotine and Alcohol Use (NAQ) | Topical | Fall (Rounds 100, 103, 106) | Both
Satisfaction with Care (SCQ) | Core | Fall (Rounds 100, 103, 106) | Both
Cognitive Measures (CMQ) | Core | Fall (Rounds 100, 103, 106) | Both
Demographics and Income (DIQ) | Core | Fall (Rounds 100, 103, 106) | Both
Beneficiary Knowledge and Information Needs (KNQ) | Topical | Winter (Rounds 98, 101, 104) | Continuing
Usual Source of Care (USQ) | Core | Winter (Rounds 98, 101, 104) | Continuing
Telemedicine (TLQ) | Topical | Winter (Rounds 98, 101, 104) | Continuing
Chronic Pain (CPQ) | Topical | Summer (Rounds 99, 102, 105) | Continuing
Income and Assets (IAQ)* | Core | Summer (Rounds 99, 102, 105) | Continuing
Drug Coverage (RXQ) | Topical | Summer (Rounds 99, 102, 105) | Continuing
Dental, Vision, and Hearing Care Utilization (DVH) | Core | All (Rounds 98-106) | Continuing
Emergency Room Utilization (ERQ) | Core | All (Rounds 98-106) | Continuing
Inpatient Utilization (IPQ) | Core | All (Rounds 98-106) | Continuing
Outpatient Utilization (OPQ) | Core | All (Rounds 98-106) | Continuing
Institutional Utilization (IUQ) | Core | All (Rounds 98-106) | Continuing
Home Health Utilization (HHQ) | Core | All (Rounds 98-106) | Continuing
Medical Provider Utilization (MPQ) | Core | All (Rounds 98-106) | Continuing
Access to Care (ACQ) | Core | Winter (Rounds 98, 101, 104) | Continuing
Prescribed Medicine Utilization (PMQ) | Core | All (Rounds 98-106) | Continuing
Other Medical Expenses (OMQ) | Core | All (Rounds 98-106) | Continuing
Statement Cost Series (STQ) | Core | All (Rounds 98-106) | Continuing
Post-Statement Cost (PSQ) | Core | All (Rounds 98-106) | Continuing
No Statement Cost Series (NSQ) | Core | All (Rounds 98-106) | Continuing
Cost Payment Summary (CPS) | Core | All (Rounds 98-106) | Continuing
Physical Measures (PXQ)^ | Core | Winter (Rounds 98, 101, 104) | Continuing, Exit Panel Only
Physical Measures (PXQ)^ | Core | Summer (Rounds 99, 102, 105) | Continuing, All Other Panels
End Section (END) | Core | All (Rounds 98-106) | Both
^Only conducted for in-person interviews.
The Facility instrument collects information that is similar in content to the Community instrument. Table B.5 presents the core and topical sections that comprise the MCBS Facility instrument. As with the Community instrument, the content and order of administration varies based on season of data collection (Fall, Winter, Summer) and the type of interview (Baseline, Continuing). Those sections with an asterisk (*) include a revision contained in this clearance request (either adding or deleting questions).
Table B.5: Facility Instrument Sections and Order of Administration
Section | Type of Section (Core or Topical) | Season of Administration (Rounds Administered) | Interview Type (Baseline, Continuing, Both)
Facility Questionnaire (FQ) | Core | All (Rounds 98-106) | Both
Residence History (RH) | Core | All (Rounds 98-106) | Both
Background Questionnaire (BQ) | Core | Fall (Rounds 100, 103, 106) | Baseline
Health Insurance (IN) | Core | Fall (Rounds 100, 103, 106) | Both
Use of Health Services (US) | Core | All (Rounds 98-106) | Continuing
Expenditures (EX) | Core | All (Rounds 98-106) | Continuing
Health Status (HS) | Core | Fall (Rounds 100, 103, 106) | Both
Beneficiary-Level COVID-19 (CV)* | Topical | All (Rounds 98-106) | Both
Facility Questionnaire Missing Data^ | Core | All (Rounds 98-106) | Both
Residence History Missing Data^ | Core | All (Rounds 98-106) | Both
Background Questionnaire Missing Data^ | Core | Fall (Rounds 100, 103, 106) | Baseline
^Section only activated and available for administration when critical data points from the FQ, RH, or BQ sections are marked as missing, Don’t Know, or Refused.
The revision to this OMB package includes the following content changes to the Community and Facility instruments.
Add one new item to the Income and Assets Questionnaire (IAQ) about participation in the Supplemental Nutrition Assistance Program (SNAP).
Add twelve new items to the Health Status and Functioning Questionnaire (HFQ):
Five items about the prevalence of bowel incontinence.
Five items about oral health-related quality of life.
Two items about insulin administration.
Add two new items to the Health Insurance Questionnaire (HIQ):
Replace one existing item about Veterans Affairs (VA) health care utilization.
Add one new item about VA health care enrollment.
Streamline COVID-19 items in the Community questionnaire and Facility instrument to remove items that are no longer relevant at this phase of the pandemic:
Delete 20 COVID-19 items from the Community questionnaire and change the administration schedule for the remaining COVID-19 items from three times per year to annual.
Delete 35 COVID-19 items from the Facility instrument and change the administration schedule for the remaining COVID-19 items from three times per year to annual.
Include updated respondent materials to increase understanding of the survey and improve participation.
Add One Item about Participation in the Supplemental Nutrition Assistance Program (SNAP). The MCBS Income and Assets Questionnaire (IAQ) is administered once per year during the Summer round. This revision adds one item to the IAQ about SNAP participation. Existing items in the IAQ collect information on the financial well-being of Medicare beneficiaries, which is used to support analyses of the effectiveness of various Medicare programs in reaching target populations and to improve outreach. The IAQ also enriches analysis of inequities in health care access and use by providing information that is critical to understanding health outcomes, health care use, and spend-down to Medicaid eligibility. As such, this section supports CMS’ engagement with Executive Orders 13985[7] and 13988[8], issued in January 2021, which called upon agencies to identify and work to redress inequities in their policies and programs that create barriers to equal opportunity and to prevent and combat discrimination on the basis of gender identity and sexual orientation, respectively. To support this goal, CMS has released a variety of measures based on IAQ data in the Financial Well-Being of Medicare Beneficiaries Public Use File (PUF)[9]. This table package is derived from the MCBS IAQ, which currently includes the USDA Economic Research Service’s six standard questions on food security[10]. These questions ask whether respondents may have skipped meals or gone hungry due to lack of money. According to the Financial Well-Being of Medicare Beneficiaries PUF, there are significant differences in food insecurity rates among Medicare beneficiaries: in 2020, 9.8 percent of White non-Hispanic beneficiaries living in the community were food insecure, compared with 25.2 percent of Black non-Hispanic beneficiaries and 28.7 percent of Hispanic beneficiaries. To further support alignment with Executive Orders 13985 and 13988, this revision adds one question on SNAP participation to the IAQ, drawn from the American Community Survey[11], which will allow CMS to enhance its understanding of food insecurity among Medicare beneficiaries and of beneficiary experiences that directly influence health outcomes. The question on SNAP participation, together with existing IAQ items about residence in Section 8 housing and receipt of Supplemental Security Income (SSI), will also give CMS a more comprehensive picture of the various non-CMS programs that Medicare beneficiaries rely on.
Add Five Items about the Prevalence of Bowel Incontinence. The Health Status and Functioning Questionnaire (HFQ) section is administered once per year in the Fall round. The revision adds five items to the HFQ to measure the prevalence and management of bowel incontinence. According to data from the 2007-2010 National Health and Nutrition Examination Survey (NHANES), over half of noninstitutionalized adults aged 65 and over reported some type of incontinence, including urinary leakage and/or accidental bowel leakage of mucus, liquid stool, or solid stool. About 8% of adults aged 65 and over had moderate, severe, or very severe bowel leakage[12]. In older adults, incontinence is associated with multiple interacting factors, including chronic conditions such as diabetes or stroke, inadequate fiber or water intake, neurologic and psychiatric conditions, cognitive impairment, and mobility impairment. Use of certain medications and polypharmacy may also contribute to bowel incontinence. Bowel incontinence has serious implications for quality of life, impacting one’s emotional, physical, and economic well-being. Further, it is associated with depression, anxiety, and self-isolation. Incontinence is also a predictor of functional limitations and is associated with an increase in falls, which may result in injuries or hospitalization[13].
Although common among older adults, bowel incontinence is not sufficiently discussed in health care settings. A 2018 study showed that most primary care providers screen for urinary incontinence but not bowel incontinence, despite the fact that these two issues can be related[14,15]. Further, a 2015 study found that less than a third of adults aged 70 and over with bowel incontinence discussed this problem with their primary care physician. Those who had mild symptoms of bowel incontinence had little to no knowledge of the treatments available to them[16].
Although the MCBS collects information on urinary incontinence, not enough is known about Medicare beneficiaries with bowel incontinence. To address this measurement gap, several items were sourced from a 2004 Mayo Clinic study[17] and modeled on the urinary incontinence items in the HFQ to capture the prevalence and type of stool leakage. Beneficiaries are first asked if they have experienced several types of bowel incontinence, including leaking gas, leaking a small amount of stool, leaking a moderate amount of stool, and leaking a large amount of liquid stool. Beneficiaries who respond affirmatively to any type of bowel incontinence are next asked if they have talked with their health care provider about this issue. These items were refined based on the results of a small cognitive testing effort (see Attachment 7). Incorporating these items on the MCBS starting in Fall 2024 (Round 100) will allow CMS to understand the prevalence of bowel incontinence among beneficiaries and develop better awareness and outreach regarding this medical issue.
Add Five Items about Oral Health-Related Quality of Life. This revision adds five items about oral health-related quality of life to the HFQ, drawn from the Oral Health Impact Profile instrument known as OHIP5. Older adults have a higher risk for poor oral health than any other age group because of insufficient dental insurance coverage, lack of access to oral health care, and the prevalence of underlying health conditions that would be best managed by medical and oral health professionals working in tandem to provide coordinated care[18]. Oral health problems in older adults include untreated tooth decay, gum disease, tooth loss, oral cancer, and chronic diseases stemming from untreated inflammation in the mouth, which can be exacerbated by dry mouth – a common side effect of many medications commonly taken by those aged 65 and older[19]. There is also substantial evidence that periodontitis is a risk factor for certain systemic diseases, and impaired oral health, including missing teeth and dry mouth, has been associated with mastication and nutritional problems, especially among the elderly, with highly negative effects on their quality of life[20].
Being disabled, homebound, or institutionalized increases the risk of poor oral health[21]. Older adults with the poorest oral health and lowest access to dental care tend to be those who are economically disadvantaged, lack insurance, and are members of racial and ethnic minorities. An Oral Health Surveillance Report from the Centers for Disease Control and Prevention (CDC) found that older non-Hispanic Black or Mexican American adults have two to three times the rate of untreated cavities as older non-Hispanic White adults. Older adults with less than a high school education have untreated cavities and complete tooth loss at nearly three times the rate of adults with at least some college education[22].
Since 2019, the MCBS has made improvements to survey content related to oral health, including improving the collection of dental utilization and cost information and adding items related to dry mouth and tooth sensitivity. These improvements allowed CMS to release the 2019 MCBS Report on Dental, Vision, and Hearing Care Services in September 2022, which provides estimates of dental, vision, and hearing care utilization by Medicare beneficiaries as well as comparisons of dental care use by certain sociodemographic characteristics[23]. They also allowed the MCBS to support a variety of data needs associated with CMS Strategic Cross-Cutting Initiatives[24]. Although the MCBS currently captures some oral health data, it does not provide a comprehensive measurement of a beneficiary’s oral health functioning and quality of life. Measures of oral function, orofacial pain, orofacial appearance, and psychosocial impact make up the four dimensions of oral health-related quality of life (OHRQoL) and are needed to provide a more complete understanding of oral health among older adults and the impact of poor oral health on the overall health of Medicare beneficiaries. The Oral Health Impact Profile (OHIP) is currently the most widely used OHRQoL instrument. The ultrashort OHIP5 has at least one indicator for each of the four dimensions and is specifically designed to provide, with only five measures, a level of analytic utility comparable to longer scales, supporting its content validity[25]. These items performed well and were easily understood during a small cognitive testing effort (see Attachment 8).
Add Two Items about Insulin Administration. During the Fall round interview, the MCBS asks beneficiaries if they have ever been diagnosed with diabetes (Baseline cases) or if they have been diagnosed with diabetes in the past year (Continuing cases). Beneficiaries who respond affirmatively to either question receive a detailed follow-up series related to diabetes management. This series asks questions such as whether the beneficiary takes insulin, how frequently insulin is taken, how frequently blood sugar is tested, and how frequently the beneficiary checks for foot sores or irritations.
CMS is requesting to expand this current diabetes management series to include two follow-up items related to diabetes management that would be administered annually in the Fall round to beneficiaries who report having diabetes and taking insulin (estimated to be approximately three percent of beneficiaries). The first new follow-up item is sourced from the Diabetes Self-Management Questionnaire[26] and asks if the beneficiary administered insulin using a syringe, insulin pen, insulin pump, and/or inhaler. The second new follow-up item is adapted from an existing MCBS item and asks if the beneficiary had trouble paying for insulin in the past 12 months.
The purpose of this modification is to provide time-sensitive information on diabetes management that will help CMS and the HHS Office of the Assistant Secretary for Planning and Evaluation (ASPE) evaluate changes in diabetes self-management associated with the insulin-related provisions of the Inflation Reduction Act (IRA) of 2022. Fielding the follow-up items in Fall 2024 will provide data points that closely correspond to the time when the IRA’s cap on out-of-pocket costs for insulin first took effect and will therefore improve CMS’ ability to observe changes in diabetes management trends since the implementation of the IRA. These changes will also help to satisfy OMB’s terms of clearance specified in the approval of the 2024 MCBS questionnaire, which requested that CMS add new items to the survey related to beneficiary-centric IRA provisions. The modified items will be integrated into the MCBS questionnaire beginning in Fall 2024 (Round 100).
Add Two Items about Veterans Affairs (VA) Health Care Enrollment and Utilization. There is considerable interest in proposals to change Medicare in ways that can slow the growth of program spending. In conjunction with Medicare, supplemental insurance affects the point-of-service price of care to the beneficiary and, thereby, influences the beneficiary’s access to health services. Supplemental insurance also influences the amount of money spent by the Medicare program, as it lowers or eliminates financial barriers to care. Because Medicare is not a fully comprehensive insurance program, the availability and coverage provided by supplementary insurance and its influence on the use and cost of care will be important for assessing Medicare expenditures and policy changes[27].
The MCBS collects information on all sources of supplemental insurance. This information is used by policymakers, including the Assistant Secretary for Planning and Evaluation (ASPE) Office of Health Policy and the Board of Trustees of the Federal Hospital Insurance and Federal Supplementary Medical Insurance Trust Funds, to evaluate enrollment trends, estimate the number of beneficiaries who may be affected by program changes, and understand the impact of additional sources of coverage on Medicare expenditures and out-of-pocket costs paid by beneficiaries[28,29]. Although the MCBS currently includes a question about whether a beneficiary has received health care services through the VA, this does not capture beneficiaries who may have had coverage through the VA but did not use VA health care services during the reference period. This method of collecting data therefore underestimates the number of beneficiaries covered by supplemental insurance available through the VA and limits policymakers’ knowledge of the alternatives available to Medicare beneficiaries to supplement their Medicare plans.
To address this gap, two items were adapted from the 2022 National Health Interview Survey (NHIS) to capture both VA health care utilization and enrollment in the Health Insurance Questionnaire (HIQ) starting in Fall 2024 Round 100[30]. These items will be administered to beneficiaries who previously reported serving in the Armed Forces of the United States in the Enumeration Summary (ENS) section of their MCBS Baseline interview. The first item, which asks if the beneficiary received any care at a VA facility or any other health care paid for by the VA in the last 12 months, will replace an existing MCBS item about receipt of VA health care services, thereby creating better alignment between the MCBS and other federal surveys. While the existing MCBS item has the same intent, the parallel item from the NHIS extends the question to explicitly ask about receipt of care in non-VA facilities. The second item is new to the MCBS and asks beneficiaries who did not report VA health care utilization in the past 12 months if they have been enrolled in VA health care in the past 12 months. Incorporating these items on the MCBS starting in Fall 2024 Round 100 will allow CMS and stakeholders to understand the prevalence of VA coverage among Medicare beneficiaries, regardless of VA health care utilization.
Streamline the COVID-19 Questionnaire (CVQ) in the Community Questionnaire. The COVID-19 Questionnaire (CVQ) currently consists of about 30 questions and takes about two minutes to administer each round. Starting in 2024, the CVQ section will be further streamlined, deleting about 20 items that are no longer relevant. In addition, the administration schedule for the CVQ will be reduced: instead of being asked three times a year (i.e., every round), the section will be administered once per year. The 2024 CVQ section will focus on COVID-19 vaccination, testing, diagnosis, symptom severity, and prevention. These items have been realigned with other federal surveys that continue to collect data about the COVID-19 pandemic[31,32,33]. Some items are also slightly re-worded to make them more appropriate and less burdensome for administration in 2024:
The COVID-19 testing series has been consolidated to ask about all types of COVID-19 tests in one series rather than asking separately about viral testing and antibody testing. Further, instead of asking a series of follow-up items about each type of COVID-19 test, including test result, wait time, and copayment, the revised series has been reduced. The beneficiary will first be asked if they have been tested for COVID-19 in the last year. If yes, they will be asked which type of test they received (e.g., via nasal swab, at-home test, or blood test) and the test result.
Rather than collecting full details for each COVID-19 vaccination dose, including vaccine date, manufacturer, and vaccine site, the revised CVQ will simply ask how many COVID-19 vaccine doses have been received by the beneficiary to convey an overall metric for vaccine uptake.
Instead of asking if the beneficiary has ever worn a mask as a COVID-19 prevention measure, the revised CVQ will ask how often the beneficiary masks in public to provide a more analytically useful measure of prevention behavior.
Fourteen total questions are being retained in the CVQ section. Only three items will be asked of all respondents; based on programmed skip logic, the remaining 11 items are asked only of the smaller number of respondents for whom they apply, as follow-ons to a previous question. Based on the results of timing tests, the streamlined CVQ is expected to take approximately 1.5 minutes on average to administer once a year in the Winter round. The removal of certain COVID-19 items and the change in the section’s administration schedule will yield a significant reduction in respondent burden while still capturing information on topics of enduring importance related to COVID-19.
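A toy sketch of the gate-and-follow-up skip logic described above is shown below; the question wording, variable names, and response options are illustrative only and are not the actual CVQ items.

```python
def administer_covid_testing_series(ask):
    """Ask the gate question and, only on a 'yes', the follow-up items."""
    responses = {"tested_last_year": ask("Tested for COVID-19 in the last year?")}
    if responses["tested_last_year"] == "yes":
        responses["test_type"] = ask("What type of test did you receive?")
        responses["test_result"] = ask("What was the result?")
    return responses

# Example run with scripted answers standing in for an interviewer session.
scripted = iter(["yes", "at-home test", "negative"])
print(administer_covid_testing_series(lambda prompt: next(scripted)))
```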
Streamline the COVID-19 Beneficiary (CV) section in the Facility Instrument and remove the COVID-19 Facility-Level (FC) section. Two sections in the Facility instrument contain measures of COVID-19. The CV section, which is administered each round, asks whether beneficiaries have had a COVID-19 test, diagnosis, or vaccination since the last interview. The FC section, which is administered once per year in the Fall round, asks about the availability of telemedicine services inside and outside of the facility, facility measures to prevent and control the spread of COVID-19, and changes in mental health services due to the pandemic. Starting in 2024, the CV section will be reduced to a series on vaccination status to be asked annually in the Winter round. In addition, the FC section will be removed from the Facility instrument in its entirety. These changes will reduce the COVID-19 content in the Facility instrument from about 40 items to five items.
Update Respondent Materials. To maximize outreach, CMS is adding one new item to the suite of existing respondent materials (see Attachment 9). This new material is designed to increase understanding of the survey, particularly the Physical Measures (PXQ) questionnaire section, and thus improve participation. It is used as a resource for interviewers when beneficiaries ask questions about the physical measures.
Interviews with Incoming Panel sample persons in the community. In the Fall rounds (Rounds 100, 103, 106), all newly selected beneficiaries will be mailed a Community Advance Letter from the Centers for Medicare and Medicaid Services (Attachment 1). Advance mail materials have been developed to accommodate interviews conducted in person and by phone. Outreach to Incoming Panel beneficiaries is conducted by telephone and in-person visits following a locating process to identify viable phone numbers for beneficiaries. Beneficiaries for whom a phone number cannot be located will be prioritized for in-person visits.
When conducting in-person interviews, field interviewers will carry copies of the advance materials (e.g., the advance letter and frequently asked questions) for respondents who do not recall receiving them in the mail, as well as a copy of the MCBS Community Brochure and the At the Door Sheet (Attachment 1). Reminder letters, thank-you letters acknowledging participation, and tailored refusal conversion letters provide additional ways to build rapport and gain cooperation with beneficiaries and further improve response rates.
The Community interviews (Rounds 98-106) will be administered to the respondent or a designated proxy using a CAPI program on a laptop computer. Attachment 2 includes a copy of all questionnaire sections administered in the Baseline and Continuing interviews, as well as the Showcards used by the interviewer to assist in the interviewing process.
At the completion of the Baseline interview (Rounds 100, 103, 106), each new respondent is provided with an MCBS calendar (Attachment 1), on which he or she is encouraged to record health care events. The same calendar is provided to all Continuing Community respondents on a yearly basis. The calendar is provided either during an in-person interview or by mail following a phone interview.
Interviews with sample persons in institutions. Regardless of mode of administration, all Facility interviews are administered to facility staff by field interviewers who use a CAPI program on a laptop computer. For all facility residents, the Facility Eligibility Screener is administered each time a respondent is found to have entered a facility, or in the case of Baseline respondents, is currently in a facility (Attachment 3). The Facility instrument to be used in Rounds 98-106 is shown in Attachment 4.
An advance letter is sent to all facilities prior to contacting the facility for an interview (Attachment 5). CMS has also developed additional materials to gain cooperation, including materials that explain how to prepare for the interview, introduce the study to staff at third-party billing offices who may provide additional survey responses, and thank the facility staff for participation.
Some facility administrators will require consent of the sample person or a next of kin before releasing any information. The data collection contractor will offer to obtain such written consent, using the Resident Consent Form, and Next of Kin Consent Form. These forms as well as a HIPAA letter are included in Attachment 5.
For Community respondents, the preferred mode is self-response. Respondents are asked to designate proxy respondents, individuals who are knowledgeable about the respondent’s health care. In the MCBS, only those individuals who are designated by the respondents can serve as proxy respondents. In addition, a proxy is used if a beneficiary has been reported as deceased during the current round’s reference period or if a beneficiary who was residing in the community in the previous round has since entered a long-term care facility. Proxy interviews are only used for the Community interview, as the Facility interview is conducted with a staff member located at the facility.
When screening a facility where a sampled beneficiary is found to be living, interviewers identify the staff at the facility best able to respond. MCBS interviewers do not interview residents in a facility. Instead, interviewers are trained to determine and seek out the appropriate staff for the interview. If a respondent is incarcerated, we do not seek response. Other institutions are treated on a case-by-case basis.
The sample for the MCBS is a heterogeneous population that presents a unique challenge for maximizing response rates. The survey selects respondents from two Medicare groups: those age 65 and over and those younger than 65 who have disabilities. Increasing age, poor health, or the poor health of a family member are common reasons for refusal. On the other hand, older persons are the least mobile segment of the population, so for a longitudinal survey the risk of failing to locate these respondents is reduced.
Because this is a longitudinal survey, it is essential that we maximize response rates. To do so, data collection staff undertake an extensive outreach effort each round. This includes notifying government entities about the survey (CMS regional offices and the CMS hotline, carriers and fiscal intermediaries, and Social Security offices), national organizations such as AARP, and various community groups (e.g., social service and health departments, home health agencies, state advocates for the elderly, and area agencies on aging). These efforts are undertaken so that these organizations can answer questions or concerns that respondents may have, increasing the likelihood that respondents will participate in the MCBS and remain in the survey panel.
Further, with the integration of telephone outreach and interviewing, additional methods have been introduced to maximize participation among new Incoming Panel members. Prefield locating activities (including electronic database searches using LexisNexis® Accurint® and TransUnion® TLOxp batch processing) are used to verify or update selected sample addresses and to obtain telephone numbers when available. Additional mailings include reminder letters and use of FedEx services, along with intensive locating and tracing efforts to maximize response.
Efforts to maximize response rates include: 1) informing authoritative sources to whom respondents are likely to turn if they question the legitimacy of the MCBS; 2) giving interviewers resources to which they can refer to reassure respondents of the legitimacy/importance of the survey; 3) generally making information about MCBS available through senior centers and other networks to which respondents are likely to belong or reach out (such as the 1-800-Medicare hotline); and 4) mailing reminder letters to respondents to encourage their participation in the survey.
CMS intensively monitors both unconditional and conditional response rates. The unconditional response rate is the percentage of the sample released during the Fall round of the selection year that responded to the survey in a given year. Unconditional response rates, also called cumulative response rates, use the originally selected sample size as the baseline in their calculation. The conditional response rate is the percentage of the sample eligible at the beginning of the Fall round of a particular year that responded during that year. Conditional response rates use the sample eligible to participate in the survey (a subset of the sample released in the Fall round of the selection year) as the baseline in their calculation; in other words, they are conditioned on eligibility. Both indicators are important for understanding response rate trends and where interventions should optimally be targeted. These trends are monitored over the full historical span of the survey, providing important insights into changes in response rates over time.
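Expressed compactly (the notation here is introduced only for illustration and is not taken from the MCBS documentation), the two rates for a given data collection year \(t\) are:

\[
\text{Unconditional (cumulative) rate}_{t} \;=\; \frac{\text{respondents in year } t}{\text{sample released in the Fall round of the selection year}},
\qquad
\text{Conditional rate}_{t} \;=\; \frac{\text{respondents in year } t}{\text{sample still eligible at the start of the Fall round of year } t}.
\]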
Response is also tracked throughout each round by a host of key indicators, including panel, HHS region, age, race, ethnicity, residential status (community or facility), and enrollment status (current-year versus non-current-year Medicare enrollees). Field interviewer performance is also tracked to identify staff who need additional training or support to improve their interview completion rates. CMS continually analyzes response rates, particularly for the subpopulations with the lowest propensity to respond, and is fully committed to finding ways to stem declining response rates.
In addition to outreach, the following efforts remain in place to reinforce the survey’s legitimacy and relevance among participants.
An advance letter is sent from CMS, with the CMS Survey Director’s signature, to both sampled beneficiaries and facility administrators. It includes an informational brochure answering anticipated questions. Reminder mailings are also sent to encourage response (Attachments 1 and 5).
A handout with Privacy Act information and an appeal to participate is given to the respondent at the door by the interviewer (Attachment 1).
Interviewer training emphasizes techniques and approaches effective in communicating with the older and disabled population and ways to overcome difficulties respondents may have in participating.
Individualized non-response letters are sent to respondents who refuse to participate (example included in Attachment 1). These letters are used when deemed appropriate by the field management staff.
NORC field management staff are specially trained to follow up with respondents who express concerns about participating due to privacy or confidentiality questions.
Proxy respondents are sought for respondents unable to respond for themselves, in order to keep them in the survey over the life of the panel.
Non-respondents are re-contacted by a refusal conversion specialist.
A dedicated project email address ([email protected]) and toll-free number (1-844-777-2151) are available to answer respondents’ questions. This information is included on various materials provided to the respondent.
An MCBS website (mcbs.norc.org) contains information for respondents about the project and has recently been updated to include a short explanatory video. Respondents are also informed about the CMS MCBS project page at www.cms.gov/mcbs.
Respondents receive an annual MCBS newsletter, which includes information about the survey as well as seasonal topics such as winter safety tips for seniors. Attachment 1 contains an example of a recent newsletter.
Whenever possible, the respondent is paired with the same interviewer throughout the survey. This maintains rapport and establishes continuity of process in the interview.
Interviewers are trained to utilize personal touches such as thank you notes and birthday cards to maintain contact with respondents.
A Community Authority Letter (Attachment 1) is sent to community organizations in advance of the Fall rounds (Rounds 97, 100, 103) to inform community representatives, such as state-level departments of aging, insurance, and state senior Medicare patrol officers, about the MCBS.
A language insert will be included with the Community Advance Letter for the Incoming Panel sample to provide an explanation of the survey for respondents who do not speak English or Spanish (Attachment 1).
In Fall 2023, OEDA and the CMS Office of Minority Health are also piloting enhanced outreach to sampled Medicare beneficiaries who identify as Hispanic, Black, or Asian. These efforts include updating interviewer training materials with additional content on culturally specific issues or concerns respondents may have, as well as tailoring outreach and contact strategies, with an emphasis on in-person interviewing for sample members predicted to be Hispanic, Black, or Asian. CMS will closely monitor the success of these outreach and interviewing strategies and data collection progress among underserved minority groups. Analysis of response rates, the level of contact effort required to complete interviews, modes of outreach, and mode of completed interview will inform the feasibility of future efforts to expand the data available for underserved Medicare beneficiaries.
A non-response bias analysis for the MCBS is conducted every three years. The most recent non-response bias analysis was based on the 2018 Panel and was released in the final 2018 Methodology Report34. That analysis also included beneficiaries who participated in the COVID-19 surveys. While non-response is carefully monitored every year, a complete non-response bias analysis is updated every three years to ascertain trends both annually and for subpopulations. The next non-response bias analysis for the MCBS is underway; it will be based on the 2021 Panel and released with the forthcoming 2021 Methodology Report in the Fall of 2023.
In the most recent non-response bias analysis, Fall 2018 respondents and nonrespondents were compared on various measures, including frame characteristics, Medicare claims payments, and chronic conditions, in order to identify areas of potential bias. The effects of weighting on potential nonresponse bias were also investigated: unweighted and weighted proportions of respondents across select frame-level attributes were compared to corresponding benchmarks. Significant differences were found among the demographic, claims payment, and chronic conditions variables. While nonrespondents appeared more likely to be female and older, and slightly more likely to fall into Missing or Other/Unknown race categories, demographic differences were not large. Significant differences were also found across various claims payment measures but were minimal and not consistently in the same direction (i.e., sometimes respondents had higher claims payments in certain settings, and other times nonrespondents did). The same was true for beneficiaries with chronic conditions: Incoming Panel respondents in the Fall round were more likely to have a few of the chronic conditions than nonrespondents, but in later rounds and for the continuing panels, nonrespondents were more likely to have some of the chronic conditions than were respondents. While many differences were found, most were not large in a practical sense. Furthermore, across most of these measures, weighted respondent distributions were closer to benchmarks than unweighted respondent distributions, suggesting that the potential bias identified via these analyses is expected to be minimized by the weighting procedures. In contrast to most surveys, the MCBS has a large amount of information to characterize nonrespondents. This information, including Medicare claims data, can be used for imputation if necessary.
While the nonresponse bias analysis excluded Medicare Advantage (MA) enrollees from many analyses, it has been noted in recent years that MA beneficiaries are more likely to respond to the MCBS than those enrolled in original Medicare. Beginning in 2017, CMS introduced additional nonresponse adjustments and calibration of the MCBS weights to match enrollment benchmarks by Fee-for-Service (FFS)/MA status, to reduce or eliminate any potential bias the differential response rates by enrollment status may have introduced.
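As an illustration only (the symbols below are assumptions for exposition, not the MCBS weighting specification), a simple post-stratification form of such a calibration scales the nonresponse-adjusted weights within each enrollment group so that they sum to the administrative benchmark:

\[
w_i^{\text{cal}} \;=\; w_i^{\text{nr}} \times \frac{N_g}{\sum_{j \in g} w_j^{\text{nr}}}
\qquad \text{for each respondent } i \text{ in enrollment group } g \in \{\text{FFS}, \text{MA}\},
\]

where \(N_g\) is the enrollment benchmark for group \(g\) and \(w^{\text{nr}}\) denotes the nonresponse-adjusted weight. In practice, calibration is typically carried out jointly across several dimensions (e.g., by raking), so this single-ratio form is a simplification.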
Over the rounds, the following patterns of nonresponse have been observed, which have or have not changed over time. In the most recent three rounds for which a full analysis of response rates have been completed, the round-level response rates for continuing panels remains high, ranging from 73.4% for the 2019 panel in Round 86 to 92.7% for the 2017 panel in Round 87. Despite these high rates, each year continuing panels are subjected to a nonresponse adjustment based on new response propensity models by panel. Incoming Panels at the first interview (e.g., the 2020 panel at Round 88) show a larger propensity for nonresponse due to having never been reached prior to the first interview. In Round 88 the response rate for the 2020 Incoming Panel was 41.9%. Once again, we rely on cells derived from response propensity models to account for differential effects of demographic and geographic characteristics on the resulting data. By accounting for these characteristics in constructing the adjustment cells, we reduce the potential for nonresponse bias that could arise due to these differential factors.
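The general shape of a propensity-model-based adjustment of this kind can be sketched as follows; the column names, the logistic model, and the use of propensity quintiles as adjustment cells are illustrative assumptions, not the MCBS production specification.

```python
# Illustrative sketch of a propensity-model-based nonresponse adjustment.
# Column names, the logistic model, and the quintile cells are assumptions
# for illustration only; they are not the MCBS production specification.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def nonresponse_adjust(frame: pd.DataFrame) -> pd.DataFrame:
    """`frame` has one row per released sample member, with a 0/1 'responded'
    flag, a 'base_weight', and categorical frame covariates."""
    covariates = ["age_group", "sex", "race_ethnicity", "census_region"]
    X = pd.get_dummies(frame[covariates], drop_first=True)

    # Estimate response propensities from frame characteristics.
    model = LogisticRegression(max_iter=1000).fit(X, frame["responded"])
    frame = frame.assign(propensity=model.predict_proba(X)[:, 1])

    # Form adjustment cells from the estimated propensities (quintiles here).
    frame["cell"] = pd.qcut(frame["propensity"], q=5, labels=False, duplicates="drop")

    # Within each cell, inflate respondents' weights by the inverse of the
    # weighted response rate so respondents also represent nonrespondents.
    def adjust(cell: pd.DataFrame) -> pd.DataFrame:
        cell = cell.copy()
        rate = (cell["base_weight"] * cell["responded"]).sum() / cell["base_weight"].sum()
        cell["adjusted_weight"] = cell["base_weight"] * cell["responded"] / rate
        return cell

    return frame.groupby("cell", group_keys=False).apply(adjust)
```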
Adaptive design methods have also been applied to measure the representativeness of the MCBS incoming sample. In 2017, CMS conducted a review of Representativity Indicators (R-indicators) for the Fall 2017 Baseline interview to monitor the representativeness of the achieved sample. The R-indicators provided a quantitative assessment of which segments of the sample were over- or under-producing and causing the achieved sample to be imbalanced in terms of representativeness.
A sample R-indicator as well as two partial R-indicators (variable and category) are used to monitor the representativeness of the panel. The variable R-indicator measures the representativeness of the sample associated with each variable (looking at the strength of each covariate, such as race, ethnicity, age, sex, and region) in predicting response propensity. The category R-indicator then looks at the categories of each variable to measure the representativeness of the responding sample.
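For reference, the sample R-indicator as commonly defined in the survey methodology literature (the MCBS methodology reports give the exact specification used in production) is one minus twice the standard deviation of the estimated response propensities:

\[
\hat{R}(\hat{\rho}) \;=\; 1 - 2\,S(\hat{\rho}),
\qquad
S(\hat{\rho}) \;=\; \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\bigl(\hat{\rho}_i - \bar{\hat{\rho}}\,\bigr)^{2}},
\]

where \(\hat{\rho}_i\) is the estimated response propensity for sampled beneficiary \(i\), \(\bar{\hat{\rho}}\) is their mean, and \(N\) is the sample size. Values near 1 indicate a responding sample that is balanced with respect to the covariates in the propensity model; smaller values signal imbalance.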
Since their introduction, the R-indicators have not been observed outside these thresholds; consequently, no data collection interventions were needed to improve the representativeness of the achieved sample. R-indicators, together with continual review of annual and historical response rates and the non-response bias analysis, are important tools for understanding response and ensuring that the sample as a whole, as well as its subpopulations, is represented well enough to produce high-quality data. Future analysis will also compare R-indicators for in-person data collection with those for telephone data collection for the Baseline sample.
Due to the pandemic, MCBS data collection from March 2020 through November 2021 was conducted by phone only. Since late 2021, some interviewing has gradually returned to in-person, while phone interviewing has been retained as a cost-efficient mode of survey administration. The MCBS is now multimode and includes both phone and in-person outreach and interviewing.
CMS has demonstrated the ability to conduct multimode MCBS data collection. Based on feedback from field staff and reviews of the data, phone data collection works well for the majority of interviews; in-person interviewing is preferred for collecting cost data from beneficiaries with large health care needs and for some subpopulations, such as persons with sensory impairments. In addition, collection of physical measures must be done during an in-person interview.
Because the COVID-19 pandemic continues to pose risks, especially to some beneficiaries in the MCBS survey population, CMS has continued to observe reluctance among the majority of respondents and interviewers to conduct interviews in person. CMS will remain sensitive to respondent preferences while recognizing the importance of in-person contact for recruitment and long-term participation in the survey, particularly for critical Medicare populations such as those with high health care use and sensory impairments. CMS has refined, and will continue to refine, operational guidelines for multi-mode administration that include in-person and phone data collection throughout the round, based both on available data concerning the optimal method of contact and on ongoing pandemic uncertainty. Responsive multi-mode design during the pandemic requires continuous monitoring not only of contact and response rates but also of local and regional preferences and regulations related to COVID-19.
Table B-6 shows the long-term goal for data collection by mode and component for 2024, assuming the impact of the COVID-19 pandemic continues to subside. These percentages will likely be modified over time as we continue to operationalize and evaluate multi-mode data collection.
CMS also assumes that the majority of Facility interviews will take place over the phone, with a small proportion conducted in person. This reflects the finding that, for facilities new to the MCBS, in-person outreach and interviewing appear to be the most successful means of gaining cooperation.
Table B-6: Anticipated Data Collection Mode and Component for 2023

Round | Community Data Collection | Facility Data Collection
Summer | 85% in person; 15% by phone | 90% by phone; 10% in person
Fall Baseline | 15% in person; 85% by phone | 90% by phone; 10% in person
Fall Continuing | 85% in person; 15% by phone | 90% by phone; 10% in person
Winter | 85% in person; 15% by phone | 90% by phone; 10% in person
An analysis of paradata and response patterns demonstrates the stability of the representativeness and quality of MCBS data collected by phone. An in-depth analysis of changes in response patterns between MCBS data collected in 2020 via telephone interviews and data collected prior to the pandemic via in-person interviews revealed limited evidence of data quality problems with phone administration. This analysis, which spanned all three data collection rounds of 2020 and included nearly 500 questionnaire and paradata variables from the Community and Facility interviews, used a model-based approach to assess the stability of trends in response patterns from 2016 through 2019 and whether 2020 data maintained or broke those trends.

Relatively large decreases in healthcare utilization and cost reporting were observed, particularly in the Community interviews in Rounds 86 and 88 and the Facility interviews in Round 87, but these are likely due in large part to actual decreases in utilization during the pandemic and are consistent with findings from other analyses of health care utilization among Medicare beneficiaries in 2020 using Fee-for-Service claims data35. While some additional utilization may have been missed due to mode changes, and the size of such an effect is difficult to ascertain, analyses showed that the MCBS FFS match and MA encounter data adjustments resulted in total utilization in line with expectations based on claims data. There were some indications of difficulties collecting data requiring physical access to documentation, such as healthcare statements and prescription medicine; these difficulties were in part due to measures taken to reduce respondent burden. Overall, few questionnaire sections showed substantial shifts in response patterns or increases in item-level nonresponse in 2020.

An additional analysis of the 2020 and 2021 data collection efforts identified no major differences in the likelihood of recruiting or retaining beneficiaries in the MCBS across a variety of demographic characteristics. This analysis did establish that beneficiaries with serious difficulty hearing or seeing may experience higher respondent burden in phone interviews; as a result, in-person outreach is prioritized for these subgroups. Further analysis will be conducted on 2022 and 2023 data as rates of COVID-19 continue to stabilize and decline, allowing for greater in-person interviewing opportunities.
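A minimal sketch of the kind of trend-stability check described above follows, assuming a simple linear trend fit to 2016–2019 annual estimates and a prediction-interval test for the 2020 value; the data values, model form, and interval level are illustrative assumptions rather than the actual analysis specification.

```python
# Minimal sketch of a trend-stability check: fit a linear trend to a
# variable's 2016-2019 annual estimates and test whether the 2020 estimate
# falls outside the trend's 95% prediction interval. The estimates below are
# hypothetical placeholders, not MCBS results.
import numpy as np
from scipy import stats

years = np.array([2016, 2017, 2018, 2019])
estimates = np.array([0.62, 0.63, 0.61, 0.64])   # hypothetical pre-pandemic estimates
observed_2020 = 0.52                              # hypothetical 2020 estimate

# Ordinary least squares fit of the estimate on year.
slope, intercept, *_ = stats.linregress(years, estimates)
fitted = intercept + slope * years
resid = estimates - fitted
n = len(years)
s = np.sqrt((resid ** 2).sum() / (n - 2))         # residual standard error

# 95% prediction interval for a new observation at year 2020.
x0 = 2020
se_pred = s * np.sqrt(1 + 1 / n + (x0 - years.mean()) ** 2
                      / ((years - years.mean()) ** 2).sum())
t_crit = stats.t.ppf(0.975, df=n - 2)
predicted = intercept + slope * x0
low, high = predicted - t_crit * se_pred, predicted + t_crit * se_pred

broke_trend = not (low <= observed_2020 <= high)
print(f"2020 prediction interval: ({low:.3f}, {high:.3f}); trend break: {broke_trend}")
```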
B4. Tests of Procedures or Methods
The MCBS generic clearance for Questionnaire Testing and Methodological Research was first approved by OMB in May 2015 and most recently received approval for revision on June 24, 2021 (OMB No. 0938-1275, expiration 06/30/2024). The generic clearance encompasses development and testing of MCBS questionnaires, instrumentation, and methodological experiments. It contains approval for six types of potential research activities:
1) cognitive interviewing, 2) focus groups, 3) usability testing, 4) field testing (both within and outside the MCBS production environment), 5) respondent debriefing questionnaire, and 6) research about incentives. Any future changes to the MCBS instrumentation, data collection methods, or procedures that require testing will be submitted as individual collection requests under the generic clearance.
In October 2021, CMS conducted six cognitive tests with respondents to assess the comprehension and sensitivity of new questionnaire items related to the use of CBD for pain management and the prevalence of bowel incontinence. All six tests were conducted on the English-language version of the items. Minor updates were made to the bowel incontinence items based on this testing to improve comprehension and flow; the remaining item tested well and therefore did not require revision. In February 2022, two additional cognitive tests were conducted on the Spanish version of the items to assess comprehension and flow. All items performed well, and none of the tested items were considered sensitive by English- or Spanish-speaking respondents. Results from this testing effort are included in Attachment 7.
In August-September 2022, CMS conducted eight cognitive tests with respondents to test comprehension and flow of new questionnaire items related to oral health (OHIP-5). Five of the eight tests were conducted in English and three tests were conducted in Spanish. The oral health items tested well and did not require revision. Results from this testing effort are included in Attachment 8.
B5. Individuals Consulted on Statistical Aspects of Design
The person responsible for statistical aspects of design is:
Edward Mulrow, Ph.D.
Vice President
NORC at the University of Chicago
4350 East-West Highway, 8th Floor
Bethesda, MD 20814
(301) 634-9441
The contractor collecting the information is NORC at the University of Chicago.
1 Note that telephone numbers for beneficiaries are not available in the CMS administrative data used for sampling. Telephone numbers were appended to sampled addresses using vendor matching software; these numbers only sometimes reached the intended respondent. Additional manual locating was conducted by the field team to improve locating rates.
2 Note that prior to 2017, 107 PSUs were used for sampling for the MCBS. These included three PSUs in Puerto Rico. Beginning in 2017, Puerto Rico was removed from the MCBS sampling frame.
3 Beginning in 2017, the 18 SSUs selected from the three Puerto Rico PSUs were removed from the sampling frame, leaving 685 SSUs for sampling for the MCBS.
4 Note that the sample released was larger than previous MCBS samples due to the pivot from in-person to telephone interviewing and the associated expected lower rates of locating and response.
5 Events and costs incurred after enrollment in Medicare but prior to the first interview.
6 Neter J, Waksberg J. A Study of Response Errors in Expenditures Data from Household Interviews. Journal of the American Statistical Association. March 1964.
7 https://www.whitehouse.gov/briefing-room/presidential-actions/2021/01/20/executive-order-advancing-racial-equity-and-support-for-underserved-communities-through-the-federal-government/
8 https://www.whitehouse.gov/briefing-room/presidential-actions/2021/01/20/executive-order-preventing-and-combating-discrimination-on-basis-of-gender-identity-or-sexual-orientation/
9 https://www.cms.gov/research-statistics-data-and-systems/research/mcbs/data-tables/2020-mcbs-financial-well-being-medicare-beneficiaries
12 Gorina Y, Schappert S, Bercovitz A, et al. Prevalence of incontinence among older Americans. National Center for Health Statistics. Vital Health Stat 3(36). 2014.
13 Farage MA, Miller KW, Berardesca E, Maibach H. Psychosocial and societal burden of incontinence in the aged population: a review. Archives of Gynecology and Obstetrics, 277: 285-290 (2008). https://link.springer.com/article/10.1007/s00404-007-0505-3
14 Brown HW, Guan W, Schmul NB, Smith PD, Whitehead WE, Rogers RG. If We Don’t Ask, They Won’t Tell Us: Screening for Urinary and Fecal Incontinence by Primary Care Providers. JABFM. 2019;31(5):774-782. https://www.jabfm.org/content/31/5/774.short
15 Whitehead WE, Borrud L, Goode PS, Meikle S, Mueller ER, Tuteja A, et al. Fecal incontinence in U.S. adults: epidemiology and risk factors. Gastroenterology, 2009 Aug; 137 (2): 512-7. https://pubmed.ncbi.nlm.nih.gov/19410574/
16 Kunduru L, Min Kim S, Keymen S, Whitehead W. Factors that affect consultation and screening for fecal incontinence. Clin Gastroenterol Hepatol. 2015 Apr; 13 (4): 709-16. https://pubmed.ncbi.nlm.nih.gov/25148761/
17 Bharucha AE, Locke GR 3rd, Seide BM, Zinsmeister AR. A new questionnaire for constipation and faecal incontinence. Aliment Pharmacol Ther. 2004 Aug 1;20(3):355-64. doi: 10.1111/j.1365-2036.2004.02028.x. PMID: 15274673
18 National Institutes of Health. Oral Health in America: Advances and Challenges. Bethesda, MD: US Department of Health and Human Services, National Institutes of Health, National Institute of Dental and Craniofacial Research, 2021. https://www.nidcr.nih.gov/research/oralhealthinamerica/section-3b-summary
20 Gil-Montoya, J. A., de Mello, A. L., Barrios, R., Gonzalez-Moles, M. A., & Bravo, M. (2015). Oral health in the elderly patient and its impact on general well-being: a nonsystematic review. Clinical Interventions in Aging, 10, 461–467. https://doi.org/10.2147/CIA.S54630
21 Patel N, Fils-Aime R, Li CH, Lin M, Robison V. Prevalence of Past-Year Dental Visit Among US Adults Aged 50 Years or Older, With Selected Chronic Diseases, 2018. Centers for Disease Control and Prevention. Research Brief. Volume 18, April 29, 2021. https://www.cdc.gov/pcd/issues/2021/20_0576.htm
22 Centers for Disease Control and Prevention. Oral Health Surveillance Report: Trends in Dental Caries and Sealants, Tooth Retention, and Edentulism, United States, 1999–2004 to 2011–2016. US Dept of Health and Human Services; 2019.
23 https://www.cms.gov/files/document/mcbs-data-highlight-utilization-dental-vision-and-hearing-care-services-2019.pdf
25 Naik A, John MT, Kohli N, Self K, Flynn P. Validation of the English-language version of 5-item Oral Health Impact Profile. J Prosthodont Res. 2016 Apr;60(2):85-91. doi: 10.1016/j.jpor.2015.12.003. Epub 2016 Jan 11. PMID: 26795728; PMCID: PMC4841723
26 https://www.natividad.com/wp-content/uploads/2018/04/Natividad-Diabetes-Questionnaire-English.pdf
27 Chulis GS, Eppig FJ, Hogan MO, Waldo DR, Arnett RH. Health Insurance and the Elderly: Data from the MCBS. Health Care Financing Review; 1993: Volume 14, Number 3.
28 ASPE Office of Health Policy. Medicare Beneficiary Enrollment Trends and Demographic Characteristics. March 2, 2022. Available from: https://aspe.hhs.gov/sites/default/files/documents/f81aafbba0b331c71c6e8bc66512e25d/medicare-beneficiary-enrollment-ib.pdf
29 2023 Annual Report of the Board of Trustees of the Federal Hospital Insurance and Federal Supplementary Medical Insurance Trust Funds. Available from: https://www.cms.gov/oact/tr/2023
30 https://www.cdc.gov/nchs/nhis/2022nhis.htm
31 https://ftp.cdc.gov/pub/Health_Statistics/NCHS/Survey_Questionnaires/NHIS/2022/EnglishQuest-508.pdf
33 https://www2.census.gov/programs-surveys/demo/technical-documentation/hhp/Phase_3-7_Household_Pulse_Survey_ENGLISH.pdf
35 For more information, please refer to: https://aspe.hhs.gov/system/files/pdf/264071/Medicare-FFS-Spending-Utilization.pdf