For Revision of Currently Approved Collection:
Medicare Current Beneficiary Survey (MCBS)
Contact Information:
William S. Long
Contracting Officer’s Representative, Medicare Current Beneficiary Survey Office of Enterprise Data and Analytics (OEDA)/CMS
7500 Security Boulevard, Mail Stop B2-04-12, Baltimore, MD 21244
(410) 786-7927
[email protected]
(410) 786-5515 (fax)
August 19, 2020
B1. Universe and Respondent Selection
B2. Procedures for Collecting Information
B3. Methods for Maximizing Response Rates and Dealing with Issues of Non-Response
B4. Tests of Procedures or Methods
B5. Individuals Consulted on Statistical Aspects of Design
Attachment 1: 60-day Federal Register Notice
Attachment 2: Community Advance Letter – English
MCBS Community Brochure – English
At the Door Sheet – English
MCBS Calendar – English
Income and Assets (IAQ) Brochure – English
Community Authority Letter
CMS Thank You Letter (Community) – English
MCBS Respondent Newsletter
Non-response letter – Continuing – English
Attachment 3: Community Instrument (Baseline and Continuing) and Showcards
Attachment 4: Facility Eligibility Screener
Attachment 5: Facility Instrument (Baseline and Continuing) and Showcards
Attachment 6: Facility Advance Letter – English
MCBS Facility Brochure – English
Resident Consent Form
Next of Kin Consent Form
HIPAA Letter – English
Attachment 7: CAPI Screenshots of Introductory Screen and Thank You Screen
The revision to this OMB package includes the following modifications to the Community instrument sections:
Revise the Beneficiary Knowledge and Information Needs Questionnaire (KNQ) to add four items on the use of the Internet for health care related information.
Revise the Health Status and Functioning Questionnaire (HFQ) to add five items about malnutrition, including three items about the use of dietary supplements and two items about unintentional weight loss.
Revise the Physical Measures Questionnaire (PXQ) to include measures of grip strength for both hands.
B1. Universe and Respondent Selection
The target universe is current Medicare beneficiaries entitled to hospital and/or supplementary medical insurance and living in the 50 states or the District of Columbia. Both institutionalized and non-institutionalized beneficiaries are represented. Table B.1 summarizes the number of beneficiaries in the target universe based on CMS administrative records through 2018 and projected estimates for 2019. The seven age groups shown in the table correspond to the primary sampling strata from which the samples for the MCBS are drawn. The age groups are defined by the beneficiaries’ age as of July 1 of the given year for 2014 and 2015, and as of December 31 of the given year for 2016 and later.
Table B.1: Universe Counts Broken Down by MCBS Age Groups (in thousands)
Age Interval | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 (est.)
Disabled
<45 | 2,081.98 | 1,938.78 | 1,888.80 | 1,842.08 | 1,791.78 | 1,757.28
45-64 | 7,147.45 | 7,207.86 | 7,150.16 | 7,076.64 | 6,903.46 | 6,846.93
Total (Disabled) | 9,229.42 | 9,146.64 | 9,038.96 | 8,918.72 | 8,695.24 | 8,604.20
Aged
65-69 | 13,541.48 | 15,312.60 | 15,727.66 | 15,767.28 | 15,978.62 | 16,261.23
70-74 | 10,973.99 | 11,640.90 | 12,401.12 | 13,080.94 | 13,647.66 | 14,319.39
75-79 | 7,890.82 | 8,314.00 | 8,607.10 | 9,080.94 | 9,463.14 | 9,926.16
80-84 | 5,767.31 | 5,999.42 | 6,069.32 | 6,137.60 | 6,301.04 | 6,515.46
85+ | 6,626.77 | 7,045.62 | 6,976.84 | 7,021.14 | 7,001.80 | 7,015.27
Total (Aged) | 44,800.37 | 48,312.54 | 49,782.04 | 51,087.90 | 52,392.26 | 54,037.50
Total (All Beneficiaries) | 54,029.80 | 57,459.18 | 58,821.00 | 60,006.62 | 61,087.50 | 62,641.71
Source: Historical counts for 2014 are based on full Medicare administrative records. Historical counts for 2015-2018 are based on a 5-percent extract of the Medicare administrative records and are computed as 20 times the extract counts.
Notes: Puerto Rico beneficiaries are excluded from counts beginning in 2017 by sample design. Projections (2019) from the historical counts are based on the annual rate of change from 2016-2018.
Totals do not necessarily equal the sum of rounded components.
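To make the projection note concrete, the sketch below carries the average annual rate of change observed between the 2016 and 2018 counts forward one year. Because the note does not fully specify the official projection method, the published 2019 estimates are only approximately reproduced; the code is illustrative, not the CMS projection program.

```python
# Illustrative sketch of the projection described in the table notes: carry the average
# annual rate of change observed over 2016-2018 forward one year to estimate 2019.
# The official projection method is not fully specified here, so published estimates
# are only approximately reproduced.

def project_next_year(count_start: float, count_end: float, years: int = 2) -> float:
    """Apply the average annual growth rate between two counts to the later count."""
    annual_rate = (count_end / count_start) ** (1 / years)
    return count_end * annual_rate

# Example using the 85+ row of Table B.1 (counts in thousands, 2016 and 2018).
print(round(project_next_year(6976.84, 7001.80), 2))  # about 7,014 vs. the published 7,015.27
```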
The target sample size of the MCBS has been designed to yield 9,467 [1] completed cases providing 2018 Cost Supplement data per year (approximately 800-900 disabled enrollees under the age of 65 in each of two age strata, and 1,400-1,700 enrollees in each of five age strata for enrollees 65 and over).
To achieve the desired number of completed cases, the MCBS selects new sample beneficiaries each year (referred to as the incoming panel) to compensate for nonresponse, attrition, and the retirement of sampled beneficiaries in the oldest panel (referred to as the exit panel) and to include current-year enrollees, while continuing to interview the non-retired portion of the continuing sample. The incoming panel is always added in the Fall round (also referred to as the baseline interview); the exit panel completes its final interview in the Winter round (the 11th and final interview for those respondents).
Each year, an analysis of non-response and attrition is conducted to determine the optimal sample size for the fall round incoming panel. Through 2009, approximately 6,500 beneficiaries were added to the sample in the fall (September – December) round each year to replace the exiting panel and to offset sample losses due to non-response and attrition. Beginning in the fall round of 2010, the number of beneficiaries included in the incoming panel was increased to approximately 7,400 to compensate for declining response rates. Over the decade, the incoming panel sample has gradually increased to approximately 11,500. The sample size results in about 36,000 interviews completed per year.
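As an illustration of the arithmetic behind these release decisions, the sketch below works backward from a target number of baseline completes to a released sample size under assumed eligibility and response rates. The rates and target shown are hypothetical placeholders, not the values CMS actually uses to size the incoming panel.

```python
# Hypothetical sketch of the sample-size arithmetic described above: given a target
# number of completed baseline interviews and assumed rates of eligibility and
# response, how many beneficiaries must be released in the incoming panel?
# The rates below are illustrative placeholders, not official MCBS planning values.
import math

def required_release(target_completes: int,
                     eligibility_rate: float,
                     response_rate: float) -> int:
    """Number of sampled beneficiaries to release so that, after ineligibility and
    nonresponse, the expected number of completes meets the target."""
    expected_yield_per_case = eligibility_rate * response_rate
    return math.ceil(target_completes / expected_yield_per_case)

# Example: 6,200 baseline completes, assuming 97% eligibility and a 55% baseline response rate.
print(required_release(6_200, eligibility_rate=0.97, response_rate=0.55))  # 11622
```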
Proxy interviews are attempted for deceased sample persons. If data are collected through the date of death, then such cases are counted as completes. Sampled beneficiaries remain in the survey when they are unavailable for an interview in a given round; that is, they are carried forward into the next round. For these individuals, the reference period for their next interview is longer as it covers the period since their last interview; this ensures that there will not be a gap in coverage of utilization and expenditure data. If a sampled beneficiary is not interviewed for two consecutive rounds, they are not scheduled for any further interviews and are taken out of case management. Such cases are treated as nonresponding cases.
The methodology for drawing the samples is described later in this document. The number of cases selected each year for the incoming panel (the designated sample size) is larger than the targeted number of completes to compensate for non-response, ineligibility, and attrition. Table B.2 illustrates the extent of the compensation required in the Fall 2018 round (Round 82) to achieve the desired number of cases providing annual data.
Table B.2: Sample Size Needed to Compensate for Initial Non-Response and Ineligibility in the 2018 Fall Round
Age on December 31 of reference year | Desired average number of cases providing annual data | Number sampled at Fall 2018 (Round 82)
18-44 | 343 | 1,184
45-64 | 332 | 864
65-69 | 687 | 2,217
70-74 | 600 | 1,609
75-79 | 603 | 1,747
80-84 | 620 | 1,837
85+ | 648 | 2,090
Total | 3,833 | 11,548
Cross-sectional sample sizes for other domains. There are multiple domains of interest in the MCBS (for example, respondents with end-stage renal disease, persons residing in nursing homes, managed care enrollees, beneficiaries of various race and ethnic backgrounds, and Medicaid recipients). The MCBS will continue to maintain a minimum target of 9,000 completed responses in the annual Cost Supplement file to ensure that analysis can be performed on MCBS data for many domains of interest.
Sample sizes for longitudinal analyses. Beginning in 2018, under the rotating panel design specified for the MCBS, respondents remain in the sample for up to eleven rounds of data collection over a four-year period; prior to 2018, respondents remained in the sample for up to twelve rounds of data collection. The historical response and attrition rates observed in the MCBS are used to determine the rotational sample size and configuration of each new incoming panel. The rotational sample design attempts to achieve consistency in subgroup sample sizes across all panels comprising a particular calendar year.
Table B.3 (in section B2 below) presents the round-by-round conditional and unconditional response rates as of Round 79 (the Fall round of 2017) for the samples (referred to in the table as “panels”) selected in 2010 through 2017. For example, from the bottom part of the table, it can be seen that by the 10th round of data collection for the 2014 panel, 23.2 percent of the 2014 panel were still in a formal responding status (that is, either the sampled beneficiary was alive and still participating in the study, or had died but a cooperative proxy was found for the collection of data on the last months of life). For the 2015 and 2016 panels, the unconditional response rates as of Round 79 were 25.6 percent (through the 7th round of data collection) and 33.3 percent (through the 4th round of data collection), respectively. The 2017 panel (the new panel selected in Round 79) had an initial response rate of 55.3 percent in its first round of data collection.
Round 79 (Fall 2017) is the latest round for which MCBS data have been fully processed. There were 2,467 interviews successfully completed at Round 79 with still-living members of the 2014 panel. For brevity, we refer to these 2,467 interviews as “live completes.” For the 2015 and 2016 panels there were 2,059 and 3,770 live Round 79 completes, respectively. For the first round of data collection for the 2017 panel, there were 6,189 completes at Round 79.
The MCBS has used a variety of techniques to maintain respondents in the survey and reduce attrition. These will be continued and adapted to comply with the time frames for initiating and implementing the continuing sample.
B2. Procedures for Collecting Information
This section describes the procedures used to select the samples for the national survey. It includes a general discussion of the statistical methodology for stratification and rotational panel selection, estimation procedures, and the degree of accuracy needed. This is followed by a presentation of how instrument sections are used to enhance the analytic potential of the MCBS data. Finally, there is a discussion of rules for allowing proxy response.
The discussion opens with a description of the MCBS sample design, followed by a general discussion of the selection of the original and annual incoming samples and the use of Medicare administrative enrollment data each year to reduce problems associated with duplication of samples across years.
PSU and Census tract clustering. The MCBS employs a complex multistage probability sample design. At the first stage of selection, the sample consists of 104 [2] primary sampling units (PSUs) defined to be metropolitan areas and clusters of nonmetropolitan counties. At the second stage of selection, samples of Census tracts are selected within the sampled PSUs. At the third and final stage of selection, stratified samples of beneficiaries within the selected Census tracts are sampled at rates that depend on age group and ethnicity.
The strata used for selection of the PSUs cover the 50 states and the District of Columbia. Since PSUs were selected randomly with probabilities proportionate to size, there are some states without any sample PSUs within their boundaries. Within major strata defined by region and metropolitan status, PSUs were sorted by percent of beneficiaries enrolled in HMOs and/or percent of beneficiaries who are minorities, based on data in CMS administrative files. Substrata of roughly equal size were created from the ordered list for sample selection.
In 2014, within the PSUs, a sample of 703 second-stage units (SSUs) consisting of Census tracts or clusters of adjacent tracts was selected. There were several steps in the SSU sampling process. First, an extract of the entire Medicare administrative enrollment data was obtained, and all beneficiaries’ addresses were geocoded to the tract level. A minimum measure of size was used to determine whether a Census tract was large enough (i.e., had enough Medicare beneficiaries) to stand on its own as an SSU or would need to be combined with one or more adjacent tracts. A frame of 24,212 SSUs was then constructed, and a sample of 703 SSUs was selected using systematic probability-proportional-to-size (PPS) sampling. These SSUs have been used for sampling MCBS beneficiaries since 2014 [3] and were sized to be used for up to 20 years. An additional sample of 339 reserve SSUs was also selected to support an expansion of the sample or the study of special rare populations in future years. To date, these reserve SSUs have not been used for sampling for the MCBS.
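The second-stage selection described above relies on systematic sampling with probability proportional to size. The sketch below shows the generic form of such a selection on a toy frame; the actual MCBS frame construction, measures of size, sort order, and software differ, so this is an illustration of the technique rather than the MCBS procedure.

```python
# Generic systematic PPS selection, a simplified illustration of the second-stage (SSU)
# sampling described above. The frame and measures of size are toy values.
import random
random.seed(2014)  # reproducible toy example

def systematic_pps(frame, n_sample, size_key="mos"):
    """Select n_sample units with probability proportional to size (PPS) using a
    systematic pass down the frame; units larger than the interval can be hit twice."""
    total_mos = sum(u[size_key] for u in frame)
    interval = total_mos / n_sample
    start = random.uniform(0, interval)
    targets = [start + k * interval for k in range(n_sample)]

    selected, cumulative, t = [], 0.0, 0
    for unit in frame:
        cumulative += unit[size_key]
        while t < len(targets) and targets[t] <= cumulative:
            selected.append(unit)
            t += 1
    return selected

# Toy frame of 100 tract-based SSUs with hypothetical beneficiary counts as measures of size.
frame = [{"ssu": i, "mos": random.randint(50, 500)} for i in range(1, 101)]
print([u["ssu"] for u in systematic_pps(frame, n_sample=10)])
```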
Table B.3: Conditional and Unconditional Response Rates for the Medicare Current Beneficiary Survey by Interview Round, 2010 through 2017 Panels

Conditional Response Rates by Interview Round
Round | 2010 Panel (n=7,260) | 2011 Panel (n=7,365) | 2012 Panel (n=7,400) | 2013 Panel (n=7,400) | 2014 Panel* (n=11,398) | 2015 Panel (n=8,621) | 2016 Panel (n=12,145) | 2017 Panel (n=11,623)
Round 1 | 77.5% | 77.4% | 73.2% | 72.8% | 58.7% | 53.3% | 54.7% | 55.3%
Round 2 | 89.0% | 88.7% | 87.6% | 87.4% | ** | 83.2% | 81.4% | –
Round 3 | 92.7% | 91.4% | 92.4% | 92.1% | 82.1% | 82.7% | 83.9% | –
Round 4 | 93.3% | 91.9% | 92.3% | 78.5% | 84.1% | 80.0% | 84.2% | –
Round 5 | 94.8% | 94.0% | 94.3% | ** | 85.9% | 88.3% | – | –
Round 6 | 94.7% | 95.4% | 94.3% | 86.9% | 81.1% | 88.0% | – | –
Round 7 | 94.2% | 94.8% | 80.7% | 87.6% | 83.4% | 87.7% | – | –
Round 8 | 96.2% | 96.2% | ** | 89.8% | 91.1% | – | – | –
Round 9 | 96.8% | 96.3% | 89.8% | 82.2% | 89.7% | – | – | –
Round 10 | 97.1% | 86.2% | 90.1% | 87.9% | 90.3% | – | – | –
Round 11 | 98.8% | ** | 93.1% | 94.4% | – | – | – | –
Round 12 | 99.6% | 96.9% | 96.0% | 97.2% | – | – | – | –

Unconditional Response Rates by Interview Round
Round | 2010 Panel (n=7,260) | 2011 Panel (n=7,365) | 2012 Panel (n=7,400) | 2013 Panel (n=7,400) | 2014 Panel* (n=11,398) | 2015 Panel (n=8,621) | 2016 Panel (n=12,145) | 2017 Panel (n=11,623)
Round 1 | 77.5% | 77.4% | 73.2% | 72.8% | 58.7% | 53.3% | 54.7% | 55.3%
Round 2 | 68.8% | 68.5% | 63.9% | 63.4% | ** | 44.2% | 44.3% | –
Round 3 | 60.4% | 62.6% | 58.6% | 57.9% | 48.1% | ‡31.7% | 38.1% | –
Round 4 | 59.6% | 57.2% | 53.5% | 44.8% | 40.1% | 32.9% | 33.3% | –
Round 5 | 55.8% | 53.4% | 50.1% | ** | 35.8% | 31.3% | – | –
Round 6 | 52.5% | 50.1% | 46.4% | 42.1% | ‡21.9% | 28.1% | – | –
Round 7 | 49.0% | 47.3% | 37.2% | 36.6% | 28.4% | 25.6% | – | –
Round 8 | 46.8% | 45.1% | ** | 33.6% | 27.1% | – | – | –
Round 9 | 44.7% | 42.5% | 35.5% | ‡20.2% | 24.6% | – | – | –
Round 10 | 43.0% | 36.3% | 31.8% | 28.6% | 23.2% | – | – | –
Round 11 | 42.0% | ** | 30.5% | 28.0% | – | – | – | –
Round 12 | 38.3% | 34.4% | 27.4% | 25.3% | – | – | – | –
– = not applicable.
* The 2014 panel response rate was impacted by several operational design changes recognized during the transition between contractors in 2014, including an extensive CAPI instrument development effort originally considered out-of-scope for transition purposes, the initial need to release a larger 2014 incoming panel sample to account for a smaller continuing sample fielded in the fall of 2014, the hiring and training of 100 new interviewers for MCBS data collection, and the decision to extend the incoming panel data collection through the release of additional replicates in December 2014, resulting in a shorter data collection period and consequently lower response rate for 2,500 sample members.
** Not available because the 2015 winter and summer rounds (R71 and R72) were combined for data collection in this year only. Again, this was due to transition activities that started in 2014 and were completed in 2015.
‡ In Summer 2016 (Round 75), some cases were intentionally not fielded and instead were included in an early case release for Fall 2016 (Round 76). The rates marked with ‡ reflect the rounds and panels in which this field strategy was used. The resulting unconditional response rates for the 2013-2015 panels in the 9th, 6th, and 3rd rounds, respectively, were lower than they would have been had the cases been fielded, but increased again in the subsequent rounds.
Selection of beneficiaries. In the Fall 2018 round (Round 82), an incoming panel sample of 11,548 beneficiaries was selected from the Medicare administrative enrollment data. This sample was clustered within the selected PSUs and SSUs and was designed to achieve uniform sampling weights within each stratum. Beginning in 2015, beneficiaries who become eligible at any time during the sampling year are also included in the Medicare administrative enrollment sampling frame (referred to as current-year enrollees). Their inclusion allows for the release of data files up to one year earlier than previously possible [4]. Also beginning in 2015, Hispanic beneficiaries living outside of Puerto Rico were oversampled. Nursing home residents are drawn into the sample in exactly the same manner as other beneficiaries residing in the community.
To date, sampling weights have been calculated for each Fall round (Rounds 1, 4, 7, …, 76) in order to produce the Survey File limited data sets (previously referred to as the Access to Care files), and for each calendar year in order to produce the Cost Supplement limited data sets (previously referred to as the Cost and Use files). In both cases, cross-sectional and longitudinal weights have been calculated. Some questionnaire sections fielded in the Winter or Summer rounds have specific cross-sectional weights calculated for them as well. In all cases, weights reflect differential probabilities of selection and differential nonresponse, and are adjusted to account for overlapping coverage of the panels included in the data files. Replicate weights were also calculated so that users can compute standard errors using replication methods. In addition to the replicate weights, stratum and unit codes exist on each weight file for users who prefer to use Taylor Series methods to estimate variances.
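To make the use of the replicate weights concrete, the following sketch estimates a weighted mean and a replication-based standard error from toy data. The simple average-of-squared-deviations form shown here is a generic placeholder; analysts should apply the combination factor documented for the MCBS replicate weights, or use the stratum and unit codes with Taylor Series software.

```python
# Hedged sketch of replication-based variance estimation using a main weight and a set
# of replicate weights (toy data). The average-of-squared-deviations factor used below
# is a generic placeholder; the correct combination factor for the MCBS replicate
# weights is documented with the data files.
import numpy as np

rng = np.random.default_rng(0)
n, n_reps = 1_000, 80
y = rng.gamma(shape=2.0, scale=1_500.0, size=n)               # e.g., annual costs (toy data)
w = rng.uniform(0.5, 2.0, size=n)                              # main analysis weight (toy)
rep_w = w[:, None] * rng.uniform(0.7, 1.3, size=(n, n_reps))   # toy replicate weights

def weighted_mean(values, weights):
    return np.sum(values * weights) / np.sum(weights)

theta = weighted_mean(y, w)
theta_reps = np.array([weighted_mean(y, rep_w[:, r]) for r in range(n_reps)])

var_hat = np.mean((theta_reps - theta) ** 2)   # generic replication variance
print(f"estimate = {theta:,.1f}, replication SE = {np.sqrt(var_hat):,.1f}")
```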
Besides standard weighting and replicate weighting, the estimation program also includes full imputation of the data sets to compensate for item non-response. Imputation procedures for charges for non-covered services and for sources of payment for covered services in the Cost Supplement files have been developed. Beginning with the 2015 data, unit-level imputation was also instituted to compensate for missing initial-round utilization and cost data [5] for current-year enrollees. The weighting and imputation of data continue each year.
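As a generic illustration of item-level imputation of the kind described (not the MCBS imputation methodology itself), the sketch below fills a missing payment amount using a simple hot deck within adjustment cells.

```python
# Illustrative hot-deck imputation within cells for a missing payment amount. This is a
# generic example of item-level imputation, not the MCBS imputation methodology.
import random
random.seed(1)

records = [
    {"cell": "65-69", "payment": 120.0},
    {"cell": "65-69", "payment": None},    # item nonresponse
    {"cell": "65-69", "payment": 310.0},
    {"cell": "85+",   "payment": None},
    {"cell": "85+",   "payment": 95.0},
]

def hot_deck(records, cell_key="cell", item="payment"):
    """Replace missing item values with a randomly chosen donor value from the same cell."""
    donors = {}
    for rec in records:
        if rec[item] is not None:
            donors.setdefault(rec[cell_key], []).append(rec[item])
    for rec in records:
        if rec[item] is None:
            rec[item] = random.choice(donors[rec[cell_key]])
    return records

print(hot_deck(records))
```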
A broad range of statistics are produced from the MCBS. There is no single attribute of beneficiaries and their medical expenses that stands out as the primary goal of the survey. Thus, there can be no simple criterion for the degree of reliability that statistics for each analytic domain should satisfy. Even with a larger sample size of 14,000 to 15,000 persons, there would be many small domains of interest for which it would be necessary to use modeling techniques or to wait several years for sufficient data to accumulate.
The MCBS will maintain a stratified approach to the selection of the sample. The sample will continue to be clustered by PSU and Census tract-based SSU and stratified by age domain and race/ethnicity; the tract-based SSU approach was an innovation first introduced in 2014 that has resulted in greater efficiencies and increased analytic opportunities. We anticipate maintaining a total of 700-900 annual cases allocated to the two younger age categories for disabled beneficiaries who are not yet 65. The two age categories were selected because they indirectly reflect the means by which the disabled person becomes eligible for Medicare. Since the number of disabled sample persons per PSU and Census tract will be small, the effects of clustering on statistical precision should be mild for this subgroup. For example, depending on the prevalence of the characteristic being estimated, the MCBS has achieved standard errors of 2-3 percentage points or lower for subgroup estimates of percentages based on 1,000 respondents.
Since many of the cost and reimbursement statistics derived from the MCBS may be heavily right-skewed (i.e., reflecting the higher end of the cost/reimbursement spectrum to a disproportionate degree), the accuracy may be lower in relative terms but still acceptable. For example, the relative standard error of the mean total Medicare reimbursements derived from the MCBS has generally ranged from 2.0-2.5% for the total sample, and 4.0-8.0% for subgroups.
Each of the age strata for the Medicare sample age 65 and over will be allocated 1,200-1,700 cases, with the oldest stratum (age 85 and over) being allocated about 1,600 cases with oversampling. A major reason for oversampling the very old is to obtain an adequate sample of nursing home stays. Variations in sampling weights across the age strata and clustering within PSU and Census tract will inflate sampling errors, but the resulting effective sample sizes should be adequate for most analyses.
Content and timing of instrument sections.
The primary variables of interest for the MCBS are the use and cost of health care services and associated sources and amounts of payment. While Medicare claims files supply information on billed amounts and Medicare payments for covered services, the survey provides important self-reported information on use of services not covered by Medicare and on payment sources and amounts for costs not reimbursed by Medicare. For both the Community and Facility components, the primary focus of the data collection is on use of services (dental, hospital, physician, medical providers, prescription medication and other medical services), sources and amounts of payment, and health insurance coverage. The MCBS interview collects continuous information on these items through thrice-yearly interviews; that is, once a new respondent completes their baseline interview, they are asked utilization and cost questions each round.
Continuous data on utilization and expenditures are required for a number of reasons. First, several of the distinct expenditure categories involve relatively rare medical events (inpatient hospital stays, use of home health care, purchase of durable medical equipment, and so forth), so limiting the reference period would mean insufficient observations for annual estimates. Second, episodes of medical care often consist of a series of services over weeks or months; data collected several times a year allow examination of the grouping of services and costs around particular episodes of care. Third, payment for medical services often occurs considerably later than the utilization, so collection of complete information about a particular event can often only be obtained sometime after the event occurs.
The administration of the instruments will continue to follow the established pattern of data collection. Baseline information will be collected in the initial interview with new incoming panel respondents. This will be followed by 10 interviews that collect utilization, cost, and other important topics. Since the initial interview always occurs in the last four months of a calendar year, collection of utilization and expenditure data in the second interview means the reference period will always begin prior to January 1st. This allows use and expenditure estimates to be produced on a calendar-year basis.
The literature (initially reported by Neter and Waksberg in 1964, and confirmed in subsequent research by other analysts) indicates that collection of behavioral information over an unbounded recall period can result in large recall errors. The incoming panel interviews covered in this clearance request (Fall 2021, Round 91; Fall 2022, Round 94; and Fall 2023, Round 97) prepare the respondent for the collection of utilization and expenditure information in subsequent rounds, thus “bounding” the recall period for the next interview. During the baseline interview, the respondent is provided with a calendar, and interviewers emphasize the importance of this tool for use in future interviews. This calendar marks the recall period for the respondent and serves as the means to record utilization as well as a prompt to retain statements and bills.
Content of the instruments, Rounds 89-97.
Nearly all of the instrument sections as currently approved by OMB are unchanged. Table B.4 presents the core and topical sections that comprise the MCBS Community instrument. As shown in the table, the content and order of administration vary based on the season of data collection (Fall, Winter, Summer) and the type of interview (Baseline, Continuing). Sections marked with an asterisk (*) include a revision contained in this clearance request (either adding or deleting questions). Occasionally an item may be moved from one questionnaire section to another to improve the flow and use of the data, or for other operational or analytic purposes.
Table B.4: Community Instrument Sections and Order of Administration
Section (listed in the order in which it is administered) | Type of Section (Core or Topical) | Season of Administration (Rounds Administered) | Interview Type (Baseline, Continuing, Both)
Introduction (INQ) | Core | All (Rounds 89-97) | Both
Enumeration (ENS) | Core | All (Rounds 89-97) | Both
Housing Characteristics (HAQ) | Topical | Fall (Rounds 91, 94, 97) | Both
Health Insurance (HIQ) | Core | All (Rounds 89-97) | Both
Dental, Vision, and Hearing Care Utilization (DVH) | Core | All (Rounds 89-97) | Continuing
Emergency Room Utilization (ERQ) | Core | All (Rounds 89-97) | Continuing
Inpatient Utilization (IPQ) | Core | All (Rounds 89-97) | Continuing
Outpatient Utilization (OPQ) | Core | All (Rounds 89-97) | Continuing
Institutional Utilization (IUQ) | Core | All (Rounds 89-97) | Continuing
Home Health Utilization (HHQ) | Core | All (Rounds 89-97) | Continuing
Medical Provider Utilization (MPQ) | Core | All (Rounds 89-97) | Continuing
Access to Care (ACQ) | Core | Winter (Rounds 89, 92, 95) | Continuing
Prescribed Medicine Utilization (PMQ) | Core | All (Rounds 89-97) | Continuing
Other Medical Expenses (OMQ) | Core | All (Rounds 89-97) | Continuing
Statement Cost Series (STQ) | Core | All (Rounds 89-97) | Continuing
Post-Statement Cost (PSQ) | Core | All (Rounds 89-97) | Continuing
No Statement Cost Series (NSQ) | Core | All (Rounds 89-97) | Continuing
Cost Payment Summary (CPS) | Core | All (Rounds 89-97) | Continuing
Mobility of Beneficiaries (MBQ) | Topical | Fall (Rounds 91, 94, 97) | Both
Preventive Care (PVQ) [6] | Topical | All (Rounds 89-97) | Both
Health Status and Functioning (HFQ)* | Core | Fall (Rounds 91, 94, 97) | Both
Physical Measures (PXQ)* | Core | Fall (Rounds 91, 94, 97) | Baseline
Chronic Pain (CPQ) | Topical | Summer (Rounds 90, 93, 96) | Continuing
Physical Measures (PXQ)* | Core | Summer (Rounds 90, 93, 96) | Continuing
Nicotine and Alcohol Use (NAQ) | Topical | Fall (Rounds 91, 94, 97) | Both
Satisfaction with Care (SCQ) | Core | Fall (Rounds 91, 94, 97) | Both
Demographics and Income (DIQ) | Core | Fall (Rounds 91, 94, 97) | Baseline
Beneficiary Knowledge and Information Needs (KNQ)* | Topical | Winter (Rounds 89, 92, 95) | Continuing
Usual Source of Care (USQ) | Core | Winter (Rounds 89, 92, 95) | Continuing
Income and Assets (IAQ) | Core | Summer (Rounds 90, 93, 96) | Continuing
Drug Coverage (RXQ) | Topical | Summer (Rounds 90, 93, 96) | Continuing
Cognitive Measures (CMQ) [7] | Core | Fall (Rounds 91, 94, 97) | Both
End Section | Core | All (Rounds 89-97) | Both
The Facility instrument collects information that is similar in content to the Community instrument. Table B.5 presents the sections that comprise the MCBS Facility instrument; all sections are considered core. As with the Community instrument, the content and order of administration varies based on season of data collection (Fall, Winter, Summer) and the type of interview (baseline, continuing).
Table B.5: Facility Instrument Sections and Order of Administration
Section | Season of Administration (Rounds Administered) | Interview Type (Baseline, Continuing, Both)
Facility Questionnaire (FQ) | All (Rounds 89-97) | Both
Residence History (RH) | All (Rounds 89-97) | Both
Background Questionnaire (BQ) | Fall (Rounds 91, 94, 97) | Baseline
Health Insurance (IN) | All (Rounds 89-97) | Both
Use of Health Services (US) | All (Rounds 89-97) | Continuing
Expenditures (EX) | All (Rounds 89-97) | Continuing
Health Status (HS) | Fall (Rounds 91, 94, 97) | Both
Facility Questionnaire Missing Data^ | All (Rounds 89-97) | Both
Residence History Missing Data^ | All (Rounds 89-97) | Both
Background Questionnaire Missing Data^ | Fall (Rounds 91, 94, 97) | Baseline
^Section only activated and available for administration when critical data points from the FQ, RH, or BQ sections are marked as missing, Don’t Know, or Refused.
The revision to this OMB package includes the following content changes to the Community instrument.
Summary of instrument changes beginning in Winter 2021 (Round 89) and continuing through Fall 2023 (Round 97):
Revise the Beneficiary Knowledge and Information Needs Questionnaire (KNQ) to add four items on the use of the Internet for health care related information.
Revise the Health Status and Functioning Questionnaire (HFQ) to add five items about malnutrition, including three items about the use of dietary supplements and two items about unintentional weight loss.
Revise the Physical Measures Questionnaire (PXQ) to include measures of grip strength for both hands.
Additions to Beneficiary Knowledge and Information Needs (KNQ)
Beginning in Winter 2021 (Round 89), the MCBS will add four items to the KNQ as part of the Winter round interview to assess how Medicare beneficiaries, particularly older adults, use the internet for health care related information and access. The items ask whether the respondent has used the internet (broadly defined as use of a personal computer, iPad, tablet, cell phone, Web-TV, or digital assistant technology such as Amazon Alexa) in the past year to look up health information, renew a prescription, schedule an appointment with a health care provider, or communicate with a health care provider. These items are being added to the MCBS for programmatic considerations within CMS regarding streamlining access to beneficiaries’ own health data as well as online Medicare enrollment and re-enrollment. Questions on access to and use of the internet assist CMS in understanding how beneficiaries make plan enrollment and provider decisions, including whether they use online tools such as Hospital Compare, Physician Compare, Nursing Home Compare, and the Prescription Drug Plan Finder to compare options. As more health providers and plans drive consumers to the internet, it is important to understand whether internet access is a barrier for older adults and the Medicare population. These topics align with the goals of the MCBS, which include providing information about Medicare beneficiaries that is not available in CMS administrative data and that is uniquely suited to evaluate or report on key outcomes and characteristics associated with beneficiaries. The items will also inform how to provide beneficiaries with information that improves their plan enrollment knowledge, provider knowledge, and communication with providers so that they can better utilize the health care system. These items were taken from the National Center for Health Statistics’ 2018 National Health Interview Survey (NHIS) (OMB control number 0920-0214).
Revise HFQ to Add Items on Malnutrition
There are many forms of malnutrition in the U.S. population, including micronutrient deficiencies. The Dietary Guidelines for Americans 2015-2020 specifically identified vitamins A, C, D, and E, calcium, magnesium, iron, potassium, choline, and fiber as “under-consumed nutrients” in adults, including some older adults. However, older adults are high consumers of dietary supplements, with about 70% of non-institutionalized U.S. older adults reporting use of at least one supplement in the past 30 days [8]. Recommended intakes for many of these micronutrients can be met by users through supplementation alone. Therefore, even without data on diet, information on dietary supplement use can provide critical information on likely micronutrient adequacy.
A second form of malnutrition is unintentional weight loss. The prevalence of undernutrition and underweight in older adults living in the community is often considerable, particularly among the very old and especially among those with chronic health conditions and functional limitations. Frailty and malnutrition due to unintentional weight loss are common, often overlapping conditions that increase in prevalence with age.
Frailty is now one of the top ten geriatric concerns and is clinically tied to increased risk of negative health outcomes such as falls, hospitalization, disability, and death. Data from the National Health and Aging Trends Study estimated that 15% of community-dwelling older adults were frail and 45% were pre-frail [9]. Additionally, up to one in two older adults is at risk for malnutrition; both conditions can decrease functionality and healthy aging. Yet no national surveys provide definitive estimates of the prevalence of both conditions in the same population of older adults.
To collect data on frailty and malnutrition for the National Institutes of Health, Office of Dietary Supplements (NIH-ODS), the MCBS is adding five items to the survey beginning in Fall 2021 (Round 91). Fewer than 10 cognitive tests were conducted, covering all of the new items. The items described below have all been used in national surveys and validated questionnaires that include Medicare beneficiaries in their target population, and they have performed well in terms of participant understanding, response time, and respondent and administrative burden. Therefore, the focus of our small number of cognitive tests was to determine changes to survey administration time and the impact on respondent burden and questionnaire flow.
Three items about dietary supplement use, including:
One item from the National Center for Health Statistics’ US National Health and Nutrition Examination Survey (NHANES) (OMB control number 0920-0950) asking about the use of dietary supplements,
One item from the National Institutes of Health, National Cancer Institute, Diet History Questionnaire (DHQ) Version I that was collected as a supplement to the National Health Interview Survey (OMB control number 0920-0214) asking about the use of multivitamins, and
One item from the National Institutes of Health, National Cancer Institute, Diet History Questionnaire (DHQ) Version III that was collected as a supplement to the National Health Interview Survey (OMB control number 0920-0214), asking which vitamins and supplements were taken in the past year.
Two items about unintentional weight loss from the Canadian Nutrition Screening Tool (CNST).
These questions will provide useful information on the prevalence of dietary supplement use in a population with very high use of supplements as well as high use of prescription and over-the-counter drugs, which raises the risk of drug-supplement interactions. Their inclusion will also allow for the assessment of how dietary supplement use changes over time and provide an opportunity to examine associations between dietary supplement use and health outcomes.
Two additional questions on malnutrition (undernutrition) screening are also proposed to better assess the prevalence of unintentional weight loss. This information, combined with other data from Medicare records, will allow researchers to assess the prevalence of unintentional weight loss and its association with adverse health outcomes in this population. These data can also be used to assess whether malnutrition is documented in Medicare records and, potentially, the associations between treatment for this type of malnutrition and improvement in health outcomes.
Revise Physical Measures Questionnaire to Include Grip Strength Measurements
In addition to measuring malnutrition, functional measures are important indicators needed to assess frailty in the Medicare population. Grip strength is often used to assess overall muscle strength in older adults, and tests of grip strength have been linked to health-related prognoses [10]. Low grip strength has been associated with falls, disability, length of hospital stays, and mortality [11]. Starting in Summer 2021 (Round 90), the MCBS will add grip strength measures to the PXQ section to measure grip strength for the beneficiary’s right and left hands. In the grip strength test, a dynamometer is used to measure the amount of force the respondent is able to apply with each hand. These measures are the cornerstone of screeners for frailty such as the Fried frailty phenotype assessment [12]. Combined with existing information on physical functioning, gait speed, and measured height and weight, the grip strength measures will also allow for more definitive identification of physical frailty in MCBS respondents.
The MCBS will incorporate the NIH Toolbox (NIH-TB) protocol for measuring grip strength. In this protocol, respondents complete one practice test and one measured grip strength test per hand while sitting with their arm at a 90-degree angle. The test will be administered only for interviews conducted with the beneficiary, not for those conducted with a proxy. Beneficiaries who are missing one or both hands, or who feel uncomfortable performing the test, will skip the test. The NIH-TB tests are designed to be portable across study designs and to impose low respondent burden in order to meet the needs of researchers on large cohort studies. These assessments have been pre-tested and validated for participants aged 3-85, and a norming study has also been conducted [13,14,15]. Accordingly, the NIH-TB grip strength protocol is designed to be used for in-home data collection without any additional equipment (beyond the dynamometer).
As described in Supporting Statement Part A, with the emergence of the COVID-19 pandemic in the U.S., CMS implemented a number of changes to the MCBS to ensure the health and safety of both respondents and field interviewers while continuing data collection. In March 2020, CMS paused in-person data collection in both community and facility settings. Field interviewers have conducted MCBS interviews by telephone since that time.
Collection of physical measures, including grip strength, assumes that in-person interviewing resumes in May of 2021; if telephone interviewing continues through the Summer and/or Fall round of 2021, the Physical Measures Questionnaire will not be administered.
Rounds 89 through 97 Data Collection Procedures
Interviews with incoming panel sample persons in the community. In the Fall rounds (Rounds 91, 94, 97), all newly selected beneficiaries will be mailed a Community Advance Letter (Attachment 2) from the Centers for Medicare & Medicaid Services. If data collection is conducted in person, field interviewers will carry copies of the advance letter for respondents who do not recall receiving one in the mail, as well as a copy of the MCBS Community Brochure and At the Door Sheet (Attachment 2).
The Community interviews (Rounds 89-97) will be administered to the respondent or a designated proxy using a CAPI program on a laptop computer. Attachment 3 includes a copy of all questionnaire sections administered in the baseline interview, the continuing interview, and the Showcards used by the interviewer to assist in the interviewing process.
At the completion of the baseline interview (Rounds 91, 94, 97), each new respondent is provided with an MCBS calendar (Attachment 2), on which he or she is encouraged to record health care events. The same calendar is provided to all continuing Community respondents on a calendar-year basis. When data collection is conducted by phone during the Fall round, the calendar is mailed to respondents.
Interviews with sample persons in institutions. All Facility interviews are administered to facility staff using a CAPI program on a laptop computer. For all facility residents, the Facility Eligibility Screener is administered each time a respondent is found to have entered a facility, or in the case of baseline respondents, is currently in a facility (Attachment 4). The Facility instrument to be used in Rounds 89-97 is shown in Attachment 5.
Some facility administrators will require consent of the sample person or a next of kin before releasing any information. The data collection contractor will offer to obtain such written consent using the Resident Consent Form and the Next of Kin Consent Form. These forms, as well as a HIPAA letter, are included in Attachment 6.
Proxy rules.
For Community respondents, the preferred mode is self-response. Respondents are asked to designate proxy respondents, that is, individuals who are knowledgeable about the respondent’s health care. In the MCBS, only individuals designated by the respondent can serve as proxy respondents.
When screening a facility where a sampled beneficiary resides, interviewers determine the staff members at the facility best able to respond. MCBS interviewers do not interview residents in a facility. Instead, interviewers are trained to determine and seek out the appropriate staff for the interview. When appropriate, interviewers abstract information from available facility records. If a respondent is incarcerated, no response is sought. Other institutions are treated on a case-by-case basis.
B3. Methods for Maximizing Response Rates and Dealing with Issues of Non-Response
The sample for the MCBS is a heterogeneous population that presents a unique challenge for maximizing response rates. The survey selects respondents from two Medicare groups: those age 65 and over and those younger than 65 who have disabilities. Both of these groups have characteristics that often lead to refusals on surveys. Increasing age, poor health, or the poor health of a family member are prevalent reasons for refusal. On the other hand, older persons are the least mobile segment of the population and thus, for a longitudinal survey, less likely to be lost due to failure to locate. Recent data on the MCBS indicate that the population aged under 65 tends to have a slightly higher response rate than the aged population.
Because this is a longitudinal survey, it is essential that we maximize response rates. To do so, data collection staff undertake an extensive outreach effort each round. This includes notifying government entities about the survey (CMS regional offices and the CMS hotline, carriers and fiscal intermediaries, and Social Security offices), national organizations such as AARP, and various community groups (e.g., social service and health departments, home health agencies, state advocates for the elderly, and area agencies on aging). These efforts are undertaken to answer questions or concerns that respondents may have, in order to increase the likelihood that respondents will participate in the MCBS and remain in the survey panel.
Specifically, efforts to maximize response rates include: 1) informing authoritative sources to whom respondents are likely to turn if they question the legitimacy of the MCBS; 2) giving interviewers resources to which they can refer to reassure respondents of the legitimacy/importance of the survey; and 3) generally making information about MCBS available through senior centers and other networks to which respondents are likely to belong or reach out (such as the 1-800-Medicare hotline).
CMS intensively monitors both unconditional and conditional response rates. The unconditional response rate is the percentage of the sample released during the fall round of the selection year that responded to the survey in a given year. Unconditional response rates, also called cumulative response rates, use the originally selected sample size as the baseline in their calculation. Conditional response rates are the percentage of the sample eligible at the beginning of the fall round of a particular year that responded during that year. Conditional response rates use the sample who are eligible to participate in the survey (a subset of the sample released in the fall round of the selection year) as the baseline in their calculation; in other words, they are conditioned on eligibility. Both indicators are important for understanding trends in response rates and where interventions should optimally be targeted. These trends are monitored over the full historical span of the survey, providing important insights into changes in response rates over time.
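As a concrete illustration of the two definitions, the sketch below computes conditional and unconditional rates for a hypothetical panel with invented counts; note that the unconditional rate through a given round roughly tracks the running product of the conditional rates, which is the relationship visible between the two blocks of Table B.3.

```python
# Illustration of conditional vs. unconditional (cumulative) response rates for a
# hypothetical panel. All counts are invented, not actual MCBS figures.
released = 10_000   # sample released in the fall round of the selection year

# Round-by-round counts: eligible at the start of the round, and responding in the round.
rounds = [
    {"eligible": 10_000, "responding": 5_500},   # baseline (round 1)
    {"eligible": 5_400,  "responding": 4_500},   # round 2
    {"eligible": 4_450,  "responding": 3_750},   # round 3
]

running_product = 1.0
for r in rounds:
    conditional = r["responding"] / r["eligible"]   # conditioned on eligibility that round
    unconditional = r["responding"] / released      # relative to the originally released sample
    running_product *= conditional
    print(f"conditional = {conditional:.1%}, unconditional = {unconditional:.1%}")

# The unconditional rate roughly tracks the running product of the conditional rates.
print(f"product of conditional rates = {running_product:.1%}")
```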
Response is also tracked throughout each round by a host of key indicators, including panel, HHS region, age, race, ethnicity, residential status (community or facility), and whether the beneficiary is a current-year Medicare enrollee. In addition, performance by field interviewers is tracked to identify any staff who need additional training or support to improve their interview completion rates. CMS continually analyzes response rates, particularly for the subpopulations with the lowest propensity to respond, and is fully committed to finding ways to stem declining response rates.
In addition to outreach, the following efforts remain in place to maintain a sense of legitimacy and relevance among survey participants.
An advance letter is sent to both sampled beneficiaries and facility administrators from CMS with the CMS Survey Director’s signature. This includes an informational brochure answering anticipated questions (Attachment 2).
A handout with Privacy Act information and an appeal to participate is given to the respondent at the door by the interviewer.
Interviewer training emphasizes techniques and approaches effective in communicating with the older and disabled population and ways to overcome difficulties respondents may have in participating.
Individualized non-response letters are sent to respondents who refuse to participate (example included in Attachment 2). These letters are used when deemed appropriate by the field management staff.
NORC field management staff specialize in following up with respondents who express concerns about participating due to privacy or confidentiality questions.
Proxy respondents are sought for respondents unable to participate for themselves in order to keep respondents in the survey over the life of the panel.
Non-respondents are re-contacted by a refusal conversion specialist.
A dedicated project email address ([email protected]) and toll-free number (1-877-389-3429) are available to answer respondents’ questions. This information is contained on various materials provided to the respondent.
An MCBS website (mcbs.norc.org) contains information for respondents on the project. Respondents are also informed about the CMS MCBS Project Page – www.cms.gov/mcbs
Respondents receive an annual MCBS newsletter, which includes information about the survey as well as seasonal topics such as winter safety tips for seniors. Attachment 2 contains an example of a recent newsletter.
Whenever possible, the respondent is paired with the same interviewer throughout the survey. This maintains rapport and establishes continuity of process in the interview.
Interviewers are trained to utilize personal touches such as thank you notes and birthday cards to maintain contact with respondents.
A non-response bias analysis for the MCBS was conducted for the first time in 2017 and released as part of the 2015 MCBS Methodology Report [16]. An updated non-response bias analysis for the MCBS is underway based on the 2018 Panel and will be released in the final 2018 Methodology Report. While non-response is carefully monitored every year, a complete non-response bias analysis is updated every three years to ascertain trends both annually and for subpopulations.
Fall 2015 respondents and non-respondents were compared on various measures, including frame characteristics, Medicare claims payments, and chronic conditions, in order to identify areas of potential bias. The only statistically significant differences were found among frame characteristics. For the 2015 Panel, non-respondents appear more likely to be female and older, and slightly less likely to be non-Hispanic black. Among the continuing panels, however, non-respondents tend to skew younger. None of the differences is large in a practical sense. The weighting procedure includes a raking step that accounts for all of the frame characteristics for which differences were found. Thus, the small potential bias identified via these analyses is expected to be minimized by the weighting procedures. In contrast to most surveys, the MCBS has a large amount of information to characterize non-respondents. This information, including Medicare claims data, can be used for imputation if necessary.
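The raking step mentioned above iteratively adjusts weights so that weighted totals match frame control totals on each margin. The sketch below shows a generic two-margin raking loop on synthetic data; it is an illustration of the technique, not the MCBS weighting specification.

```python
# Generic raking (iterative proportional fitting) of weights to frame control totals
# on two margins (sex and age group). Synthetic data; not the MCBS weighting procedure.
import numpy as np

rng = np.random.default_rng(42)
n = 2_000
sex = rng.choice(["F", "M"], size=n, p=[0.52, 0.48])
age = rng.choice(["65-74", "75-84", "85+"], size=n, p=[0.55, 0.30, 0.15])
w = np.ones(n)                                    # start from base weights

controls = {                                      # known frame totals (toy values)
    "sex": {"F": 1_150.0, "M": 850.0},
    "age": {"65-74": 1_050.0, "75-84": 640.0, "85+": 310.0},
}
margins = {"sex": sex, "age": age}

for _ in range(25):                               # a handful of iterations usually suffices
    for var, totals in controls.items():
        for category, target in totals.items():
            mask = margins[var] == category
            current = w[mask].sum()
            if current > 0:
                w[mask] *= target / current

print({c: round(w[sex == c].sum(), 1) for c in ("F", "M")})
print({c: round(w[age == c].sum(), 1) for c in ("65-74", "75-84", "85+")})
```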
Over the rounds, the following patterns of nonresponse have been observed. In the most recent three rounds for which a full analysis of response rates has been completed, the round-level response rates for continuing panels remain high, ranging from 80.0% for the 2015 panel in Round 76 to 96.0% for the 2012 panel in Round 75. Despite these high rates, each year continuing panels are subjected to a nonresponse adjustment based on new response propensity models by panel. Incoming panels at the first interview (e.g., the 2015 panel at Round 73) show a larger propensity for nonresponse because sampled beneficiaries have never been contacted prior to the first interview. In Round 76 the response rate for the 2016 incoming panel was 54.7%. Once again we rely on cells derived from response propensity models to account for differential effects of demographic and geographic characteristics on the resulting data. In 2016 the covariates most closely related to response propensity in the incoming panel were: the mean response rate over the previous 5 years in the same county; entitlement to Part B (2-level: yes, no); age category (7-level: under 45, 45 to 64, 65 to 69, 70 to 74, 75 to 79, 80 to 84, and 85 years or older); and tract-level median household income for households where the householder is at least 65 years of age (4-level: quartiles of median household income in the past 12 months, in 2015 inflation-adjusted dollars). By accounting for these characteristics in constructing the adjustment cells, we reduce the potential for nonresponse bias that could arise from these differential factors.
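To illustrate the general form of such a propensity-based adjustment, the sketch below fits a logistic response-propensity model on synthetic frame covariates, groups cases into propensity-based adjustment cells, and inflates respondent weights by the inverse of the weighted cell response rate. The data, covariate coding, and cell definitions are invented; the actual MCBS models and cells differ.

```python
# Synthetic sketch of a propensity-model nonresponse adjustment: fit a response
# propensity model on frame covariates, form adjustment cells from the predicted
# propensities, and inflate respondent weights by the inverse weighted cell response
# rate. Data, covariates, and cells are invented; the actual MCBS models differ.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 5_000
df = pd.DataFrame({
    "base_weight": rng.uniform(1.0, 3.0, n),
    "part_b": rng.integers(0, 2, n),            # entitlement for Part B (yes/no)
    "age_group": rng.integers(0, 7, n),         # 7-level age category
    "tract_income_q": rng.integers(1, 5, n),    # tract-level income quartile
})
true_p = 1 / (1 + np.exp(-(-0.3 + 0.4 * df.part_b - 0.05 * df.age_group + 0.1 * df.tract_income_q)))
df["responded"] = rng.binomial(1, true_p)

X = pd.get_dummies(df[["part_b", "age_group", "tract_income_q"]].astype("category"))
propensity_model = LogisticRegression(max_iter=1_000).fit(X, df["responded"])
df["propensity"] = propensity_model.predict_proba(X)[:, 1]
df["cell"] = pd.qcut(df["propensity"], q=5, labels=False, duplicates="drop")

# Weighted response rate within each adjustment cell, then inflate respondents' weights.
df["w_resp"] = df["base_weight"] * df["responded"]
cell_rate = df.groupby("cell")["w_resp"].sum() / df.groupby("cell")["base_weight"].sum()
df["nr_adj_weight"] = np.where(df["responded"] == 1,
                               df["base_weight"] / df["cell"].map(cell_rate), 0.0)
print(df.loc[df["responded"] == 1, "nr_adj_weight"].describe())
```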
Adaptive design methods have also been applied to measure the representativeness of the MCBS incoming sample. In 2017, CMS conducted a review of Representativity Indicators (R-indicators) for the Fall 2017 baseline interview to monitor the representativeness of the achieved sample. The R-indicators provided a quantitative assessment of which segments of the sample were over- or under-producing and causing the achieved sample to be imbalanced in terms of sample representativeness.
A sample R-indicator as well as two partial R-indicators (variable and category) are used to monitor representativeness of the panel. The variable R-indicator measures the representativeness of the sample associated with each variable (looking at the strength of each covariate subpopulation, such as race, ethnicity, age, sex, and region) in predicting response propensity. The category R-indicator then looks at the categories of each variable to measure the representativeness of the responding sample.
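For reference, the sample-level R-indicator is commonly defined as R = 1 - 2*S(p_hat), where S(p_hat) is the (design-weighted) standard deviation of the estimated response propensities; values near 1 indicate response behavior that is essentially unrelated to the model covariates. A minimal sketch of that calculation, using illustrative propensities and weights rather than actual MCBS estimates, is shown below.

```python
# Minimal sketch of the sample-level R-indicator, R = 1 - 2 * S(p_hat), computed from
# design-weighted estimated response propensities. The propensities and weights below
# are illustrative, not actual MCBS estimates.
import numpy as np

rng = np.random.default_rng(3)
p_hat = rng.beta(a=8, b=6, size=5_000)     # estimated response propensities (toy)
w = rng.uniform(1.0, 3.0, size=5_000)      # design weights (toy)

p_bar = np.average(p_hat, weights=w)
s = np.sqrt(np.average((p_hat - p_bar) ** 2, weights=w))
print(f"sample R-indicator = {1 - 2 * s:.3f}")   # 1.0 indicates fully representative response
```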
In Fall 2016 and Fall 2017, R-indicators were not observed outside acceptable thresholds; consequently, no data collection interventions were needed to improve the representativeness of the achieved sample. The use of R-indicators, along with a continual review of annual and historical response rates and non-response bias analyses, are important tools in understanding response and ensuring that the sample as a whole, as well as subpopulations, are represented to produce high quality data.
B4. Tests of Procedures or Methods
The generic clearance for Questionnaire Testing and Methodological Research for the MCBS was approved by OMB in May 2015 and received approval for an extension without change on May 18, 2018 (OMB No. 0938-1275, expiration 05/31/2021). The generic clearance encompasses development and testing of MCBS questionnaires, instrumentation, and methodological experiments. It contains approval for seven types of potential research activities:
1) cognitive interviewing, 2) focus groups, 3) usability testing, 4) field testing, 5) respondent debriefing questionnaire, 6) split ballot and other methodological experiments, and 7) research about incentives. Any future changes to the MCBS instrumentation, data collection methods, or procedures that require testing will be submitted as individual collection requests under the generic clearance.
B5. Individuals Consulted on Statistical Aspects of Design
The person responsible for statistical aspects of design is:
Edward Mulrow, Ph.D., Vice President
NORC at the University of Chicago
4350 East-West Highway, 8th Floor
Bethesda, MD 20814
(301) 634-9441
The contractor collecting the information is NORC at the University of Chicago.
[1] Note that the historical target of 11,500 responding beneficiaries across all panels was not achievable in 2018; the target was reduced to 9,467, which was the maximum number of completed interviews achievable within budget.
[2] Note that prior to 2017, 107 PSUs were used for sampling for the MCBS. These included three PSUs in Puerto Rico. Beginning in 2017, Puerto Rico was removed from the MCBS sampling frame.
[3] Beginning in 2017, the 18 SSUs selected from the three Puerto Rico PSUs were removed from the sampling frame, leaving 685 SSUs for sampling for the MCBS.
[4] For example, persons who became eligible for Medicare during 2015 could have incurred health care costs in 2015. By including such persons in the sampling process up to a year earlier than was done previously, they can be appropriately represented in the 2015 Cost Supplement File up to a year earlier.
[5] Events and costs incurred after enrollment in Medicare but prior to the first interview.
[6] Physical measures items in the PXQ section were migrated from the HFQ to a separate questionnaire section for ease of administration and data processing.
[7] Cognitive measures items in the CMQ section were migrated from the HFQ to a separate questionnaire section for ease of administration and data processing.
[8] Gahche, J.J., et al. Dietary Supplement Use Was Very High among Older Adults in the United States in 2011-2014. J Nutr, 2017. 147(10): p. 1968-1976.
[9] Bandeen-Roche, K., et al. Frailty in Older Adults: A Nationally Representative Profile in the United States. J Gerontol A Biol Sci Med Sci, 2015. 70(11): p. 1427-34.
[10] Sasaki H, Kasagi F, Yamada M, and Fujita S. 2007. “Grip Strength Predicts Cause-Specific Mortality in Middle-Aged and Elderly Persons.” The American Journal of Medicine, 120: 337-342.
[11] Roberts HC, Denison HJ, Martin HJ, Patel HP, Syddall H, Cooper C, Sayer AA. 2011. “A Review of the Measurement of Grip Strength in Clinical and Epidemiological Studies: Towards a Standardised Approach.” Age and Ageing, 40(4): 423-429. doi: 10.1093/ageing/afr051
[12] Fried, L.P., et al. Frailty in older adults: evidence for a phenotype. J Gerontol A Biol Sci Med Sci, 2001. 56(3): p. M146-56.
[13] Gershon R.C., Wagster M.V., Hendrie H.C., Fox N.A., Cook K.F., Nowinski C.J. (2013). NIH Toolbox for Assessment of Neurological and Behavioral Function. Neurology; 80 (11 Supplement 3).
[14] Reuben D.B., Magasi S., McCreath H.E., Bohannon R.W., Wang YC., Bubela D.J., Rymer W.Z., Beaumont J., Rine R.M., Lai JS., Gershon R.C. (2013). Motor assessment using the NIH Toolbox. Neurology; 80 (11 Supplement 3).
[15] Beaumont J.L., Havlik R., Cook K.F., Hays R.D., Wallner-Allen K., Korper S.P., Lai JS., Nord C., Zill N., Choi S., Yost K.J., Ustsinovich V., Brouwers P., Hoffman H.J., Gershon R. (2013). Norming plans for the NIH Toolbox. Neurology; 80 (11 Supplement 3).
[16] https://www.cms.gov/Research-Statistics-Data-and-Systems/Research/MCBS/Codebooks-Items/2015_MCBS_Methods_Report