
Medicare Current Beneficiary Survey (MCBS) (CMS-P-0015A)

OMB: 0938-0568


Supporting Statement B
For Revision of Currently Approved Collection:
Medicare Current Beneficiary Survey (MCBS)

Contact Information:

William S. Long

Contracting Officer’s Representative, Medicare Current Beneficiary Survey

Office of Enterprise Data and Analytics (OEDA)/CMS

7500 Security Boulevard, Mailstop B2-04-12

Baltimore, MD 21244

(410) 786-7927

[email protected]

(410) 786-5515 (fax)



January 29, 2021

Table of Contents


List of Attachments

Attachment 1: 60-day Federal Register Notice

Attachment 2: Community Advance Letter (for in-person interviews) – English

Community Advance Letter (for telephone interviews) – English

Community Advance Letter (for COVID-19 Supplement only) – English

Community Advance Letter (for NGACO Sample only) – English

MCBS Community Brochure (for in-person interviews) – English

MCBS Community Brochure (for telephone interviews) – English

At the Door Sheet – English

MCBS Calendar – English

Income and Assets (IAQ) Brochure – English

Community Authority Letter

Community Authority Letter (NGACO Sample only)

CMS Thank You Letter (Community) – English

MCBS Respondent Newsletter

Non-response letter – Continuing – English

Reminder Advance Letter (for telephone interviews) – English

Reminder Postcard (for in-person interviews) – English

Reminder Postcard (for telephone interviews) – English

Reminder Letter (for in-person interviews) – English

Reminder Letter (for telephone interviews) – English

Reminder Letter (for NGACO Sample only) – English

Attachment 3: Community Instrument (Baseline and Continuing) and Showcards

Attachment 4: Facility Eligibility Screener

Attachment 5: Facility Instrument (Baseline and Continuing) and Showcards

Attachment 6: Facility Advance Letter (for in-person interviews) – English

Facility Advance Letter (for telephone interviews) – English

MCBS Facility Brochure – English

Resident Consent Form

Next of Kin Consent Form

HIPAA Letter – English

Attachment 7: CAPI Screenshots of Introductory Screen and Thank You Screen

Attachment 8: COVID-19 Summer 2020 Supplement Test Report

Attachment 9: Crosswalk of Changes from Summer 2020 COVID-19 Supplement to Fall COVID-19 Supplement

B. Statistical Methods

The revision to this OMB package includes the following modifications to the Community and Facility instruments:

  • Add COVID-19 questions to the Community and Facility instruments to collect vital information about the impact of the pandemic on Medicare beneficiaries. Also administer these items in Winter 2021 to a separate cohort of Medicare beneficiaries aligned to a provider that participates in the Next Generation Accountable Care Organization (NGACO) Model.

  • Revise the Housing Characteristics Questionnaire (HAQ) to add two items about housing insecurity.

  • Revise the Health Status and Functioning Questionnaire (HFQ) to add one item about social isolation.

B1. Universe and Respondent Selection

The target universe is current Medicare beneficiaries entitled to hospital and/or supplementary medical insurance and living in the 50 states or the District of Columbia. Both institutionalized and non-institutionalized beneficiaries are represented. Table B.1 summarizes the number of beneficiaries in the target universe based on CMS administrative records through 2019 and projected estimates for 2020. The seven age groups shown in the table correspond to the primary sampling strata from which the samples for the MCBS are drawn. The age groups are defined by the beneficiaries’ age as of July 1 of the given year for 2015, and as of December 31 of the given year for 2016 and later.

Table B.1: Universe Counts Broken Down by MCBS Age Groups (in thousands)

Age Interval       2015        2016        2017        2018        2019        2020 (est.)
Disabled
  <45             1,938.78    1,888.80    1,842.08    1,791.78    1,771.52    1,751.49
  45 to 64        7,207.86    7,150.16    7,076.64    6,903.46    6,773.12    6,645.24
  Total           9,146.64    9,038.96    8,918.72    8,695.24    8,544.64    8,396.73
Aged
  65 to 69       15,312.60   15,727.66   15,767.28   15,978.62   16,368.74   16,768.38
  70 to 74       11,640.90   12,401.12   13,080.94   13,647.66   14,322.88   15,031.51
  75 to 79        8,314.00    8,607.10    9,080.94    9,463.14    9,820.30   10,190.94
  80 to 84        5,999.42    6,069.32    6,137.60    6,301.04    6,441.96    6,586.03
  85+             7,045.62    6,976.84    7,021.14    7,001.80    7,052.58    7,103.73
  Total          48,312.54   49,782.04   51,087.90   52,392.26   54,006.46   55,680.59
Total            57,459.18   58,821.00   60,006.62   61,087.50   62,551.10   64,077.32

Source: Universe counts are based on a 5-percent extract of the Medicare administrative records and are computed as 20 times the extract counts.

Notes: Puerto Rico beneficiaries are excluded from counts beginning in 2017 by sample design. Projections (2020) from the historical counts are based on the annual rate of change from 2017-2019.

Totals do not necessarily equal the sum of rounded components.

The target sample size of the MCBS has been designed to yield 9,996 completed cases providing 2019 Cost Supplement data per year (approximately 800-900 disabled enrollees under the age of 65 in each of two age strata, and 1,400-1,700 enrollees in each of five age strata for enrollees 65 and over).

To achieve the desired number of completed cases, the MCBS selects new sample beneficiaries each year (referred to as the incoming panel) to compensate for nonresponse, attrition, and retirement of sampled beneficiaries in the oldest panel (referred to as the exit panel) and to include the current-year enrollees, while continuing to interview the non-retired portion of the continuing sample. The incoming panel is always added in the Fall round (also referred to as the baseline interview); the exit panel retires in the Winter round, which is the 11th and final interview for all respondents.

Each year, an analysis of non-response and attrition is conducted to determine the optimal sample size for the fall round incoming panel. Through 2009, approximately 6,500 beneficiaries were added to the sample in the fall (September – December) round each year to replace the exiting panel and to offset sample losses due to non-response and attrition. Beginning in the fall round of 2010, the number of beneficiaries included in the incoming panel was increased to approximately 7,400 to compensate for declining response rates. Over the decade, the incoming panel sample has gradually increased to approximately 11,500. The sample size results in about 36,000 interviews completed per year.

Proxy interviews are attempted for deceased sample persons. If data are collected through the date of death, then such cases are counted as completes. Sampled beneficiaries remain in the survey when they are unavailable for an interview in a given round; that is, they are carried forward into the next round. For these individuals, the reference period for their next interview is longer as it covers the period since their last interview; this ensures that there will not be a gap in coverage of utilization and expenditure data. If a sampled beneficiary is not interviewed for two consecutive rounds, they are not scheduled for any further interviews and are taken out of case management. Such cases are treated as nonresponding cases.

The methodology for drawing the samples is described later in this document. The number of cases selected each year for the incoming panel (the designated sample size) is larger than the targeted number of completes to compensate for non-response, ineligibility, and attrition. Table B.2 illustrates the extent of the compensation needed in the Fall 2019 round (Round 85) to achieve the desired number of cases providing annual data.

Table B.2: Sample Size Needed to Compensate for Initial Non-Response and Ineligibility in the 2019 Fall Round

Age on December 31      Desired average number of       Number sampled at
of reference year       cases providing annual data     Fall 2019 Round 85
18-44                            343                          1,218
45-64                            332                            920
65-69                            687                          2,297
70-74                            600                          1,642
75-79                            603                          1,745
80-84                            620                          1,856
85+                              648                          1,942
Total                          3,833                         11,620

Cross-sectional sample sizes for other domains. There are multiple domains of interest in the MCBS (for example, respondents with end-stage renal disease, persons residing in nursing homes, managed care enrollees, beneficiaries of various racial and ethnic backgrounds, Medicaid recipients, and beneficiaries aligned to a provider participating in accountable care organizations). The MCBS will continue to maintain a minimum target of 9,000 completed responses in the annual Cost Supplement file to ensure that analysis can be performed on MCBS data for many domains of interest.

Next Generation Accountable Care Organizations Cohort

An additional sample of Medicare beneficiaries aligned to Next Generation Accountable Care Organizations will be drawn in Winter 2021 Round 89 to receive the COVID-19 items during a one-time data collection effort. Beneficiaries will be selected from the Medicare administrative enrollment data. This additional sample will exclude any people who have previously participated in the MCBS. A stratified simple random sample will be selected, where the stratification will be by age group (<65, 65-74, 75-84, 85+). To ensure a sufficient sample within each age group, CMS will target 1,250 completed surveys within each age group, for a total of 5,000 completed surveys. All interviews will be completed by phone. This target allows us to achieve a 2.7% margin of error for a 95% confidence interval within each age group, assuming an outcome centered around 50%. To achieve the overall number of completes and an assumed response rate of 28.0%, a sample of 17,782 will be selected for fielding during the Winter 2021 Round 89 data collection period.
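The sample-size arithmetic above can be sketched in a few lines. This is an illustrative calculation only, using a normal approximation with no finite population correction; the 2.7% and 17,782 figures cited in the text reflect additional rounding and design adjustments beyond the simple formulas shown here.

```python
import math

def margin_of_error(n_completes, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    assuming a simple random sample of n_completes responses."""
    return z * math.sqrt(p * (1 - p) / n_completes)

def fielded_sample_size(target_completes, response_rate):
    """Cases to field so that the expected number of completes,
    at the assumed response rate, meets the target."""
    return math.ceil(target_completes / response_rate)

# 1,250 completes per age group, outcome centered around 50%
print(round(margin_of_error(1250) * 100, 1))   # -> 2.8 (the text cites 2.7%)

# 5,000 total completes at an assumed 28.0% response rate
print(fielded_sample_size(5000, 0.28))         # -> 17858 (the text fields 17,782)
```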


The assumed response rate of 28.0% is based on a few key parameters. First, we plan for an abbreviated data collection period of approximately ten weeks of fielding, compared to 24 weeks for a typical round with incoming MCBS respondents. Additionally, this survey will be fielded only by phone, and we are reliant upon vendor phone matching to attain phone numbers for the sampled beneficiaries; in-person field interviewing provides a significant boost in response rates compared to phone interviews, and the phone effort depends on locating services finding usable and accurate phone numbers. Lastly, during prior rounds of the MCBS, those under 65 years of age and those over 85 years of age have tended to respond at lower rates than those aged 65-74 and 75-84. Because we are targeting equal numbers of completed interviews in each age group, these lower-responding age groups have more of an impact on the overall expected response rate for the NGACO sample than on the general MCBS sample, where the age distribution is more proportional to the population. Our targeted response rate of 28% takes each of these components into account. As described in Section B.3, a non-response bias analysis for this cohort will be included in the comprehensive non-response bias analysis planned for the 2020 data collection year. Representativity Indicators (R-indicators) will also be used to measure representativeness during fielding.


Sample sizes for longitudinal analyses. Beginning in 2018, under the rotating panel design specified for the MCBS, respondents remain in the sample for up to eleven rounds of data collection over a four-year period; prior to 2018, respondents remained in the sample for up to twelve rounds of data collection. The historical response and attrition rates observed in the MCBS are used to determine the rotational sample size and configuration of each new incoming panel. The rotational sample design attempts to achieve consistency in subgroup sample sizes across all panels comprising a particular calendar year.


Table B.3 (in section B2 below) presents the round-by-round conditional and unconditional response rates as of Round 82 (Fall round of 2018) for the samples (referred to in the table as “panels”) selected in 2012 through 2018. For example, from the bottom part of the table, it can be seen that by the 10th round of data collection for the 2015 panel, 21.7 percent of the 2015 panel were still in a formal responding status (that is, either the sampled beneficiary was alive and still participating in the study or had died but a cooperative proxy was found for the collection of data on the last months of life) or had participated in the survey until death, leaving enough data to estimate the last months of life. For the 2016 and 2017 panels, the unconditional response rates as of Round 82 were 25.4 percent (through the 7th round of data collection) and 33.7 percent (through the 4th round of data collection), respectively. The 2018 panel (the new panel selected in Round 82) had an initial response rate of 55.9 percent in its first round of data collection.
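The relationship between the two rate types in Table B.3 can be illustrated in a few lines: the unconditional rate for a round is approximately the running product of the round-by-round conditional rates. The match is approximate only, since the published rates reflect rounding and round-specific eligibility adjustments.

```python
def unconditional_rates(conditional_pct):
    """Cumulative (unconditional) response rates, computed as the
    running product of round-by-round conditional rates (percent)."""
    out, cum = [], 1.0
    for rate in conditional_pct:
        cum *= rate / 100.0
        out.append(round(cum * 100, 1))
    return out

# First five conditional rates for the 2012 panel (Table B.3)
print(unconditional_rates([73.2, 87.6, 92.4, 92.3, 94.3]))
# -> [73.2, 64.1, 59.2, 54.7, 51.6]; the published unconditional
#    rates (73.2, 63.9, 58.6, 53.5, 50.1) differ slightly because
#    of rounding and eligibility adjustments.
```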

Round 82 (Fall 2018) is the latest round for which MCBS data have been fully processed. There were 1,732 interviews successfully completed at Round 82 with still-living members of the 2015 panel. For brevity, we refer to these 1,732 interviews as “live completes.” For the 2016 and 2017 panels there were 2,882 and 3,677 live Round 82 completes, respectively. For the first round of data collection for the 2018 panel, there were 6,156 completes at Round 82.

The MCBS has used a variety of techniques to maintain respondents in the survey and reduce attrition. These will be continued and adapted to comply with the time frames for initiating and implementing the continuing sample.

B2. Procedures for Collecting Information

This section describes the procedures used to select the samples for the national survey. It includes a general discussion of the statistical methodology for stratification and rotational panel selection, estimation procedures, and the degree of accuracy needed. This is followed by a presentation of how instrument sections are used to enhance the analytic potential of the MCBS data. Finally, there is a discussion of rules for allowing proxy response.

  1. Statistical Methodology for Stratification and Sample Selection

This section opens with a description of the MCBS sample design. This is followed by a general discussion of the selection of the original and annual new incoming samples and the use of Medicare administrative enrollment data each year to reduce problems associated with duplication of samples across the years.

  a. PSU and Census tract clustering. The MCBS employs a complex multistage probability sample design. At the first stage of selection, the sample consists of 104 primary sampling units (PSUs) defined to be metropolitan areas and clusters of nonmetropolitan counties. At the second stage of selection, samples of Census tracts are selected within the sampled PSUs. At the third and final stage of selection, stratified samples of beneficiaries within the selected Census tracts are sampled at rates that depend on age group and ethnicity.

The strata used for selection of the PSUs cover the 50 states and the District of Columbia. Since PSUs were selected randomly with probabilities proportionate to size, there are some states without any sample PSUs within their boundaries. Within major strata defined by region and metropolitan status, PSUs were sorted by percent of beneficiaries enrolled in HMOs and/or percent of beneficiaries who are minorities based on data in CMS administrative files. Substrata of roughly equal size were created from the ordered list for sample selection.

In 2014, within the PSUs, a sample of 703 second-stage units (SSUs) consisting of Census tracts or clusters of adjacent tracts was selected. There were several steps in the SSU sampling process. First, an extract of the entire Medicare administrative enrollment data was obtained, and all beneficiaries’ addresses were geocoded to the tract level. A minimum measure of size was used to determine whether a Census tract was large enough (i.e., had enough Medicare beneficiaries) to stand on its own as an SSU or would need to be combined with one or more adjacent tracts. A frame of 24,212 SSUs was then constructed, and a sample of 703 SSUs was selected using systematic probability proportional to size. These SSUs have been used for sampling MCBS beneficiaries since 2014 and were sized to be used for up to 20 years. An additional sample of 339 reserve SSUs was also selected to support an expansion of the sample or the study of special rare populations in future years. To date, these reserve SSUs have not been used for sampling for the MCBS.
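The systematic probability-proportional-to-size (PPS) selection described above can be sketched generically as follows. The frame below is hypothetical; the actual SSU frame, measures of size, and sort order are defined in the MCBS design documentation.

```python
import random

def systematic_pps(sizes, n_sample, seed=12345):
    """Select n_sample units systematically with probability
    proportional to size. sizes[i] is the measure of size (e.g.,
    the Medicare beneficiary count) for frame unit i. Units larger
    than the selection interval can be selected more than once."""
    total = sum(sizes)
    interval = total / n_sample
    rng = random.Random(seed)
    start = rng.uniform(0, interval)
    # Selection points: start, start + interval, start + 2*interval, ...
    points = [start + k * interval for k in range(n_sample)]
    selected, cum, idx = [], 0.0, 0
    for i, size in enumerate(sizes):
        cum += size
        # A unit is hit once for each selection point inside its
        # cumulative-size range.
        while idx < n_sample and points[idx] < cum:
            selected.append(i)
            idx += 1
    return selected

# Hypothetical frame of 20 SSU sizes; select 5 units
demo_rng = random.Random(1)
frame = [demo_rng.randint(50, 500) for _ in range(20)]
print(systematic_pps(frame, 5))
```

In practice the frame is sorted (for example geographically) before selection, so the systematic pass also imposes implicit stratification.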



Table B.3: Conditional and Unconditional Response Rates through the 2018 Panel for the Medicare Current Beneficiary Survey, by Interview Round

Conditional Response Rates (%)

Round   2012 Panel   2013 Panel   2014 Panel*   2015 Panel   2016 Panel   2017 Panel   2018 Panel
        (n=7,400)    (n=7,400)    (n=11,398)    (n=8,621)    (n=12,145)   (n=11,623)   (n=11,523)
1          73.2         72.8         58.7          53.3         54.7         55.3         55.9
2          87.6         87.4         ***           83.2         81.4         79.9         -
3          92.4         92.1         82.1          82.7         83.9         83.1         -
4          92.3         78.5         84.1          80.0         84.2         85.1         -
5          94.3         ***          85.9          88.3         87.9         -            -
6          94.3         86.9         81.1          88.0         87.7         -            -
7          80.7         87.6         83.4          87.7         88.1         -            -
8          ***          89.8         91.1          91.5         -            -            -
9          89.8         82.2         89.7          92.0         -            -            -
10         90.1         87.9         90.3          91.9         -            -            -
11         93.1         94.4         96.2          -            -            -            -
12         96.0         97.2         -             -            -            -            -

Unconditional Response Rates (%)

Round   2012 Panel   2013 Panel   2014 Panel*   2015 Panel   2016 Panel   2017 Panel   2018 Panel
        (n=7,400)    (n=7,400)    (n=11,398)    (n=8,621)    (n=12,145)   (n=11,623)   (n=11,523)
1          73.2         72.8         58.7          53.3         54.7         55.3         55.9
2          63.9         63.4         ***           44.2         44.3         43.7         -
3          58.6         57.9         48.1          31.7         38.1         37.7         -
4          53.5         44.8         40.1          32.9         33.3         33.7         -
5          50.1         ***          35.8          31.3         29.0         -            -
6          46.4         42.1         21.9          28.1         27.5         -            -
7          37.2         36.6         28.4          25.6         25.4         -            -
8          ***          33.6         27.1          23.0         -            -            -
9          35.5         20.2         24.6          22.7         -            -            -
10         31.8         28.6         23.2          21.7         -            -            -
11         30.5         28.0         23.0          -            -            -            -
12         27.4         25.3         -             -            -            -            -

* The 2014 panel response rate was impacted by several operational design changes during the transition between contractors in 2014, including an extensive CAPI instrument development effort originally considered out-of-scope for transition purposes, the initial need to release a larger 2014 incoming panel sample to account for a smaller continuing sample fielded in the fall of 2014, the hiring and training of 100 new interviewers for MCBS data collection, and the decision to extend incoming panel data collection through the release of additional replicates in December 2014, which left a shorter data collection period, and consequently a lower response rate, for those 2,500 sample members.

*** Not available because the 2015 winter and summer rounds (R71 and R72) were combined for data collection in this year only. Again, this was due to transition activities that started in 2014 and were completed in 2015.

In rounds where some cases are intentionally not fielded, unconditional response rates will be lower than they would have been if all eligible cases had been fielded. This field strategy was used in Summer 2016 (Round 75) and Winter 2018 (Round 80), and affects the rates for the corresponding rounds and panels in the table. In Summer 2016 (Round 75), some cases were intentionally not fielded and instead were included in an early case release for Fall 2016 (Round 76). The resulting unconditional response rates for the 2013-2015 panels in the 9th, 6th, and 3rd rounds, respectively, were lower than they would have been had the cases been fielded, but increased again in subsequent rounds. In Winter 2018 (Round 80), a group of 306 cases was intentionally not fielded as part of a strategic NIR experiment.

  b. Selection of beneficiaries. In the Fall 2019 Round 85, an incoming panel sample of 11,620 beneficiaries was selected from the Medicare administrative enrollment data. This sample was clustered within the selected PSUs and SSUs and was designed to achieve uniform sampling weights within each stratum. Beginning in 2015, beneficiaries eligible anytime during the sampling year are also included in the Medicare administrative enrollment sampling frame (referred to as current-year enrollees). Their inclusion allows for the release of data files up to one year earlier than previously possible. Also beginning in 2015, Hispanic beneficiaries living outside of Puerto Rico were oversampled. Nursing home residents are drawn into the sample in exactly the same manner as other beneficiaries residing in the community.

  2. Estimation Procedure

To date, sampling weights have been calculated for each Fall round (1, 4, 7, …, and 85) in order to produce the Survey File limited data sets (previously referred to as the Access to Care files), and for each calendar year in order to produce the Cost Supplement limited data sets (previously referred to as the Cost and Use files). In both cases, cross-sectional and longitudinal weights have been calculated. Some questionnaire sections fielded in the Winter or Summer rounds have specific cross-sectional weights calculated for them as well. In all cases, weights reflect differential probabilities of selection and differential nonresponse, and are adjusted to account for overlapping coverage of the panels included in the data files. Replicate weights were also calculated so that users can calculate standard errors using replication methods. In addition to the replicate weights, stratum and unit codes exist on each weight file for users who prefer to use Taylor Series methods to estimate variances.
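The replication idea described above can be sketched generically. The data and weights below are hypothetical, and the scaling factor is a placeholder; the MCBS's actual replication method, number of replicates, and scaling are specified in its data user documentation.

```python
def weighted_mean(values, weights):
    """Weighted mean of values under the given analysis weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def replicate_variance(values, full_weights, replicate_weights, factor=1.0):
    """Estimate the variance of a weighted mean from R sets of
    replicate weights: sum of squared deviations of the replicate
    estimates from the full-sample estimate, times a method-specific
    scaling factor (shown here as 1.0 for simplicity)."""
    theta = weighted_mean(values, full_weights)
    reps = [weighted_mean(values, rw) for rw in replicate_weights]
    return factor * sum((t - theta) ** 2 for t in reps)

# Hypothetical data: 4 respondents, 3 replicate weight sets
y = [10.0, 12.0, 9.0, 15.0]
w = [1.0, 2.0, 1.5, 1.0]
rw = [[1.1, 1.9, 1.5, 1.0],
      [0.9, 2.1, 1.4, 1.1],
      [1.0, 2.0, 1.7, 0.9]]
print(round(replicate_variance(y, w, rw), 4))
```

Each replicate weight set perturbs the full-sample weights, so the spread of the replicate estimates around the full-sample estimate captures the sampling variability of the design.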

Besides standard weighting and replicate weighting, another part of the estimation program is the full imputation of the data sets to compensate for item non-response. Imputation procedures have been developed for charges for non-covered services and for sources of payment for covered services in the Cost Supplement files. Beginning with the 2015 data, unit-level imputation was also instituted to compensate for missing initial-round utilization and cost data for current-year enrollees. The weighting and imputation of data continue each year.
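Item-level imputation of this kind is often done with techniques such as hot-deck imputation, sketched below with hypothetical data. The MCBS's actual imputation models are considerably more elaborate; this only illustrates the basic donor idea.

```python
import random

def hot_deck_impute(values, groups, seed=7):
    """Fill missing values (None) by drawing a donor at random from
    respondents in the same imputation class (group)."""
    rng = random.Random(seed)
    donors = {}
    for v, g in zip(values, groups):
        if v is not None:
            donors.setdefault(g, []).append(v)
    return [v if v is not None else rng.choice(donors[g])
            for v, g in zip(values, groups)]

# Hypothetical charges with item non-response, classed by age group
charges = [120.0, None, 95.0, None, 210.0]
age_grp = ["65-69", "65-69", "85+", "85+", "65-69"]
print(hot_deck_impute(charges, age_grp))
```

Imputation classes are chosen so that donors resemble the cases with missing data; here the (hypothetical) class is age group alone.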

  3. Degree of accuracy needed for the purpose described in the justification

A broad range of statistics are produced from the MCBS. There is no single attribute of beneficiaries and their medical expenses that stands out as the primary goal of the survey. Thus, there can be no simple criterion for the degree of reliability that statistics for each analytic domain should satisfy. Even with a larger sample size of 14,000 to 15,000 persons, there would be many small domains of interest for which it would be necessary to use modeling techniques or to wait several years for sufficient data to accumulate.

The MCBS will maintain a stratified approach to the selection of the sample. The sample will continue to be clustered by PSU and Census tract-based SSU and stratified by age domain and race/ethnicity; the tract-based SSU approach, first introduced in 2014, has resulted in greater efficiencies and increased analytic opportunities. We anticipate maintaining a total of 700-900 annual cases allocated to the two younger age categories for disabled beneficiaries who are not yet 65. These two age categories were selected because they indirectly reflect the means by which the disabled person becomes eligible for Medicare. Since the number of disabled sample persons per PSU and Census tract will be small, the effects of clustering on statistical precision should be mild for this subgroup. For example, depending on the prevalence of the characteristic being estimated, the MCBS has achieved standard errors of 2-3% or lower for subgroup estimates of percentages based on 1,000 respondents.

Since many of the cost and reimbursement statistics derived from the MCBS may be heavily right-skewed (i.e., reflecting the higher end of the cost/reimbursement spectrum to a disproportionate degree), the accuracy may be lower in relative terms but still acceptable. For example, the relative standard error of the mean total Medicare reimbursements derived from the MCBS has generally ranged from 2.0-2.5% for the total sample, and 4.0-8.0% for subgroups.

Each of the age strata for the Medicare sample age 65 and over will be allocated 1,200-1,700 cases, with the oldest stratum (age 85 and over) being allocated about 1,600 cases with oversampling. A major reason for oversampling the very old is to obtain an adequate sample of nursing home stays. Variations in sampling weights across the age strata and clustering within PSU and Census tract will inflate sampling errors, but the resulting effective sample sizes should be adequate for most analyses.

  4. Interview content for periodic data collection cycles to reduce burden

  a. Content and timing of instrument sections.

The primary variables of interest for the MCBS are the use and cost of health care services and associated sources and amounts of payment. While Medicare claims files supply information on billed amounts and Medicare payments for covered services, the survey provides important self-reported information on use of services not covered by Medicare and on payment sources and amounts for costs not reimbursed by Medicare. For both the Community and Facility components, the primary focus of the data collection is on use of services (dental, hospital, physician, medical providers, prescription medication and other medical services), sources and amounts of payment, and health insurance coverage. The MCBS interview collects continuous information on these items through thrice-yearly interviews; that is, once a new respondent completes their baseline interview, they are asked utilization and cost questions each round.

Continuous data on utilization and expenditures are required for a number of reasons. First, several of the distinct expenditure categories involve relatively rare medical events (inpatient hospital stays, use of home health care, purchase of durable medical equipment, and so forth), so limiting the reference period would mean insufficient observations for annual estimates. Second, episodes of medical care often consist of a series of services over weeks or months; data collected several times a year allow examination of the grouping of services and costs around particular episodes of care. Third, payment for medical services often occurs considerably later than the utilization, so collection of complete information about a particular event can often only be obtained sometime after the event occurs.

The administration of the instruments will continue to follow the established pattern of data collection. Baseline information will be collected in the initial interview with new incoming panel respondents. This will be followed with 10 interviews to collect utilization, cost and other important topics. Since the initial interview always occurs in the last four months of a calendar year, collection of utilization and expenditure data in the second interview means the reference period will always begin prior to January 1st. This creates use and expenditure estimates on a calendar year basis.

The literature (initially reported by Neter and Waksberg in 1964, and confirmed in subsequent research by other analysts) indicates that collection of behavioral information over an unbounded recall period can result in large recall errors. The incoming panel interviews covered in this clearance request (Fall 2021, Round 91; Fall 2022, Round 94; and Fall 2023, Round 97) prepare the respondent for the collection of utilization and expenditure information in subsequent rounds, thus “bounding” the recall period for the next interview. During the baseline interview, the respondent is provided with a calendar, and interviewers emphasize the importance of this tool for use in future interviews. The calendar marks the recall period for the respondent and serves both as a means to record utilization and as a prompt to retain statements and bills.

  b. Content of the instruments, Rounds 89-97.

Nearly all of the instrument sections as currently approved by OMB are unchanged. Table B.4 presents the core and topical sections that comprise the MCBS Community instrument. As shown in the table, the content and order of administration vary based on season of data collection (Fall, Winter, Summer) and the type of interview (Baseline, Continuing). Those sections with an asterisk (*) include a revision contained in this clearance request (either adding or deleting questions). Occasionally an item may be moved from one questionnaire section to another to improve the flow and use of the data, or for other operational or analytic purposes.

Table B.4: Community Instrument Sections and Order of Administration

Section                                              Type       Season of Administration    Interview Type
(listed in order of administration)                  (Core or   (Rounds Administered)       (Baseline,
                                                     Topical)                               Continuing, Both)
Introduction (INQ)                                   Core       All (Rounds 89-97)          Both
Enumeration (ENS)                                    Core       All (Rounds 89-97)          Both
Housing Characteristics (HAQ)*                       Topical    Fall (Rounds 91, 94, 97)    Both
Health Insurance (HIQ)                               Core       All (Rounds 89-97)          Both
Dental, Vision, and Hearing Care Utilization (DVH)   Core       All (Rounds 89-97)          Continuing
Emergency Room Utilization (ERQ)                     Core       All (Rounds 89-97)          Continuing
Inpatient Utilization (IPQ)                          Core       All (Rounds 89-97)          Continuing
Outpatient Utilization (OPQ)                         Core       All (Rounds 89-97)          Continuing
Institutional Utilization (IUQ)                      Core       All (Rounds 89-97)          Continuing
Home Health Utilization (HHQ)                        Core       All (Rounds 89-97)          Continuing
Medical Provider Utilization (MPQ)                   Core       All (Rounds 89-97)          Continuing
Access to Care (ACQ)                                 Core       Winter (Rounds 89, 92, 95)  Continuing
Prescribed Medicine Utilization (PMQ)                Core       All (Rounds 89-97)          Continuing
Other Medical Expenses (OMQ)                         Core       All (Rounds 89-97)          Continuing
Statement Cost Series (STQ)                          Core       All (Rounds 89-97)          Continuing
Post-Statement Cost (PSQ)                            Core       All (Rounds 89-97)          Continuing
No Statement Cost Series (NSQ)                       Core       All (Rounds 89-97)          Continuing
Cost Payment Summary (CPS)                           Core       All (Rounds 89-97)          Continuing
Mobility of Beneficiaries (MBQ)                      Topical    Fall (Rounds 91, 94, 97)    Both
Preventive Care (PVQ)                                Topical    All (Rounds 89-97)          Both
Health Status and Functioning (HFQ)*                 Core       Fall (Rounds 91, 94, 97)    Both
Physical Measures (PXQ)                              Core       Fall (Rounds 91, 94, 97)    Baseline
Chronic Pain (CPQ)                                   Topical    Summer (Rounds 90, 93, 96)  Continuing
Physical Measures (PXQ)                              Core       Summer (Rounds 90, 93, 96)  Continuing
Nicotine and Alcohol Use (NAQ)                       Topical    Fall (Rounds 91, 94, 97)    Both
Satisfaction with Care (SCQ)                         Core       Fall (Rounds 91, 94, 97)    Both
Demographics and Income (DIQ)                        Core       Fall (Rounds 91, 94, 97)    Baseline
Beneficiary Knowledge and Information Needs (KNQ)    Topical    Winter (Rounds 89, 92, 95)  Continuing
Usual Source of Care (USQ)                           Core       Winter (Rounds 89, 92, 95)  Continuing
Income and Assets (IAQ)                              Core       Summer (Rounds 90, 93, 96)  Continuing
Drug Coverage (RXQ)                                  Topical    Summer (Rounds 90, 93, 96)  Continuing
Cognitive Measures (CMQ)                             Core       Fall (Rounds 91, 94, 97)    Both
COVID-19 (CVQ)*                                      Topical    All (Rounds 89-97)          Both
End Section                                          Core       All (Rounds 89-97)          Both

The Facility instrument collects information that is similar in content to the Community instrument. Table B.5 presents the sections that comprise the MCBS Facility instrument; all sections are considered core. As with the Community instrument, the content and order of administration varies based on season of data collection (Fall, Winter, Summer) and the type of interview (baseline, continuing).

Table B.5: Facility Instrument Sections and Order of Administration

Section

Season of Administration (Rounds Administered)

Interview Type (Baseline, Continuing, Both)

Facility Questionnaire (FQ)

All (Round 89-97)

Both

Residence History (RH)

All (Round 89-97)

Both

Background Questionnaire (BQ)

Fall (Rounds 91, 94, 97)

Baseline

Health Insurance (IN)

All (Round 89-97)

Both

Use of Health Services (US)

All (Round 89-97)

Continuing

Expenditures (EX)

All (Round 89-97)

Continuing

Health Status (HS)

Fall (Rounds 91, 94, 97)

Both

COVID-19 Questionnaire (CV)*

All (Round 89-97)7

Both

Facility Questionnaire Missing Data^

All (Round 89-97)

Both

Residence History Missing Data^

All (Round 89-97)

Both

Background Questionnaire Missing Data^

Fall (Rounds 91, 94, 97)

Baseline

^Section only activated and available for administration when critical data points from the FQ, RH, or BQ sections are marked as missing, Don’t Know, or Refused.

The revision to this OMB package includes the following content changes to the Community instrument.

Summary of instrument changes beginning in Winter 2021 Round 89 through Fall 2023 Round 97:
  • Add COVID-19 questions to the Community and Facility instruments to collect vital information about the impact of the pandemic on Medicare beneficiaries. Administer these items to a separate cohort of Medicare beneficiaries aligned to a provider that participates in the Next Generation Accountable Care Organization Model in Winter 2021.

  • Revise the Housing Characteristics Questionnaire (HAQ) to add two items about housing insecurity.

  • Revise the Health Status and Functioning Questionnaire (HFQ) to add one item about social isolation.

Add COVID-19 Questions to Community and Facility Instruments

With the emergence of the COVID-19 pandemic in the U.S., CMS was uniquely positioned to quickly collect vital information on how the pandemic is impacting the Medicare population by utilizing the MCBS. MCBS beneficiaries are at elevated risk for more severe COVID-19 complications. Since the MCBS has a sample size sufficient for estimation, it provides a ready source to obtain high quality data on this important population. Administrative records, such as Fee for Service (FFS) claims and Medicare Advantage Encounter data, only collect information about actual health care utilization and cost related to COVID-19 (i.e., services rendered and received by the beneficiaries). These administrative records do not provide data about care not received or offered or the ability of beneficiaries to receive such services by particular service delivery methods. The impact of COVID-19 on the lives of Medicare beneficiaries, such as availability of telehealth services among their providers, the ability to receive telehealth services by the beneficiaries, deferred medical care due to the pandemic, and consequences for their behavior and well-being cannot be measured by the administrative records. The collected information allows CMS to assess the impact of forgone care, the differential impact of new ways of delivering services to beneficiaries, as well as what services may not be offered by their providers. MCBS collects information on all types of healthcare, not just care covered by Medicare in order to measure the full cost and impact of healthcare delivery system changes on the beneficiary. Additionally, administrative data do not have detailed covariates to understand the differential impact of COVID-19 on minority and disadvantaged/low income populations.

As described in Supporting Statement A, on August 7, 2020, CMS received emergency OMB approval (0938-1379) for COVID-19 items to be administered by telephone during Fall 2020 Round 88 as a Supplement to the main MCBS. Previously, these items were tested in Summer 2020 under CMS-10549 GenIC#7 MCBS COVID-19 Rapid Response Supplement Testing, which was approved by OMB on May 7, 2020 under the MCBS Generic Clearance (0938-1275). The methods and questionnaires used for the Fall COVID-19 Supplement were mostly the same as those used in the COVID-19 Supplement Test with two main differences: (1) some terminology and questions were changed to align with other Federal surveys or to meet additional needs of CMS and CDC collaborators and (2) some questions have been added for Facility administration. Please refer to Attachment 8 for a report on the Summer 2020 COVID-19 Supplement Test and Attachment 9 for a full crosswalk of changes documenting the differences between the Summer 2020 COVID-19 Supplement to the Fall 2020 COVID-19 Supplement items.

As indicated in the justification to the emergency clearance, starting in Winter 2021 Round 89, CMS plans to continue fielding the COVID-19 items each round in the Community and Facility instruments, as long as it is relevant to do so, depending on the trajectory of the pandemic. As with other questionnaire sections, the Community and Facility COVID-19 questions will be contained in a COVID-19 Questionnaire (CVQ) and will contain parallel items tailored to each data collection setting. For the CVQ in the Community questionnaire, the questionnaire content will be specific to the impact of COVID-19 on the respondent’s life, such as availability of telehealth, deferred medical care, COVID-19 testing and health consequences, and the impact of the pandemic on their behavior and well-being. Working with the Centers for Disease Control and Prevention, questions about vaccine uptake or likelihood of getting a vaccine have also been added. These items include important data points, such as dosage and dates of vaccination receipt, which are not covered by COVID-19 vaccination items found on other Federal surveys, such as the Census Bureau’s Household Pulse Survey8. Further, comprehensive COVID-19 vaccination data for a representative sample of the Medicare population are not available from claims data. Fielding these items via the MCBS allows for the collection of comprehensive, representative COVID-19 vaccination estimates that can be analyzed alongside other key covariates released in the MCBS Limited Data Set files and Public Use Files. Inclusion of questions on the MCBS about the month/year date of COVID-19 vaccination and number of doses received will allow evaluation of series completion by detailed race/ethnicity and other sociodemographic and health care access related factors not available with the doses administered data provided by vaccinators to states or CDC or with the CMS administrative FFS claims and MA encounter data. 
COVID-19 vaccine providers are required to report limited sociodemographic information (i.e., age, sex, race/ethnicity), but reporting of race/ethnicity is incomplete or missing, and other sociodemographic factors such as income and health care access is only available on the MCBS. The collection of these data points on the MCBS allows for evaluation of the association of COVID-19 vaccination with a much broader set of sociodemographic and health care access variables than would be available with Medicare administrative data. Additionally, having data on timing of vaccination will help in pooling data over time for more detailed analysis. Addressing inequities in COVID-19 vaccination administration is a priority as outlined in the new administration’s COVID-19 national strategy9.

For the COVID-19 Questionnaire administered in facility interviews, the questionnaire content includes several facility-level measures covering the following topics: suspension of health services (both covered and non-covered Medicare services including vision, hearing, and dental services); use of telemedicine; measures to prevent and control the spread of COVID-19 at the facility; changes in staffing and providers; and efforts to address mental health and loneliness among residents. These topics were requested by CMS’ Chief Medical Officer to assess key ways in which COVID-19 has impacted facilities that serve Medicare beneficiaries; this information is not available from other sources, particularly since they encompass facility-level metrics that cannot be assessed by individual-level data sources such as claims data. Further, while some information about COVID-19 infections, COVID-19 testing, and personal protective equipment and hand hygiene supplies at facilities are required by the National Healthcare Safety Network (NHSN), this information is not redundant with items asked in the MCBS COVID-19 Supplement. The NHSN reporting requirements only apply to nursing homes and do not include other facility types, such as assisted living facilities. MCBS Facility data collection encompasses any facility type where MCBS beneficiaries may reside, including both Medicare-certified nursing homes and non-Medicare-certified assisted living facilities, group homes, etc. which are not required to report COVID-19 information to the NHSN10. There are also several beneficiary-level topics, similar to the community questionnaire: COVID-19 testing and treatment; services with additional provider types due to COVID-19 diagnosis; CDC COVID-19 vaccine items; and mental health (e.g., Patient Health Questionnaire or PHQ-9). 
As is always done on the MCBS, facility data collection is conducted with facility staff knowledgeable about the facility’s protocols and the beneficiary’s health status.

In Winter 2021, CMS plans to leverage the MCBS data collection infrastructure to administer the COVID-19 items to a separate cohort of Medicare beneficiaries aligned to providers participating in a new type of accountable care organization (ACO) model launched by CMS in 2016, called the Next Generation ACO Model (NGACO Model). Beneficiaries who enroll in an NGACO benefit from enhanced care coordination. Prior to the outbreak of COVID-19 the NGACO Model offered ACOs flexibility in delivering telehealth services through the Telehealth Expansion Waiver from 2016 and onwards, making it the first shared savings model to do so. Other CMS Innovation Center models offered telehealth expansion waivers for specific episodes of care alone while for shared savings ACO models in CMS’ Pathways to Success, telehealth expansion was offered from the beginning of 2020. ACOs also are familiar with their enrollees from the beginning of the year, and use population heath strategies to proactively manage these beneficiaries' health. These population health management strategies include data analytics (to identify & target those with high-risk of hospitalizations) and care management (to address their patients' care needs). While the main MCBS sample includes some NGACO enrollees (due to random selection among all beneficiaries), administering the COVID-19 items in parallel to the main MCBS sample and a separate cohort of the NGACO population will enhance CMS’ ability to analyze differences for these two populations in a timely, comparable way for a variety of outcomes related to the COVID-19 pandemic. With care coordination and increased access to telehealth among the goals of the NGACO Model, the COVID-19 items present an opportunity to assess its success on those fronts. 
Further, these items provide an opportunity to descriptively compare experiences of beneficiaries in organizations with years of experience with care coordination and telehealth (such as NGACOs) with the experiences on beneficiaries in Fee for Service Medicare. Because COVID-19 items will be administered to a representative cohort, results from this one-time survey may be generalized to other NGACO enrollees.

The Community and Facility COVID-19 Questionnaires will be administered in Winter 2021 Round 89, Summer 2021 Round 90, and Fall 2021 Round 91, as long as it is relevant to do so. The administration of COVID-19 items to a separate cohort of NGACO enrollees will only be included during Winter 2021 data collection. The questions are the same as those approved by OMB on August 7, 2020 under the emergency clearance request (0938-1379).

Revise Housing Characteristics Questionnaire (HAQ) to Add Items about Housing Insecurity

During each fall round, the MCBS asks about housing characteristics. However, the section does not include questions about housing insecurity, which is important to consider in the context of public health. The Healthy People 2020 summary of housing insecurity suggests that people who are cost burdened (those who spend more than 30% of their income on housing) and those who are severely cost burdened (those who spend more than 50% of their income on housing) may be more likely to live in housing that includes mold exposure, inadequate heating or cooling systems, and environmental pollutants that pose health risks. Additionally, severely cost-burdened families are more likely to live in housing that includes overcrowd rooms or homes, which can affect mental health, stress levels, and sleep, as well as increase the risk of infectious diseases like COVID-19. In addition to a dwelling itself being a risk factor, the neighborhood where a dwelling is located can affect a person’s health and can lead to differences in outcomes such as prevalence of obesity and prevalence of diabetes. Working with the CMS Office of Minority Health (OMH), and in consultation with a Technical Expert Panel (TEP) including members from the U.S. Department of Housing and Urban Development (HUD), two housing insecurity questions from the Accountable Health Communities (AHC) Health-Related Social Needs Screening Tool are planned for inclusion in the MCBS beginning in Fall 2021 Round 91; these items will be administered in the HAQ each Fall round. The first item asks about the beneficiary’s living situation. The second item asks if the beneficiary’s place of residence has any problems with pests, mold, lead pint, lack of heat, lack of working oven or stove, lack of working smoke detectors, or water leaks. The questions have been demonstrated to have 97% sensitivity and 83% specificity in a validation study. 
The AHC Screening Tool was developed by a panel of interdisciplinary experts that reviewed evidence-based ways to measure Social Determinants of Health (SDOH), such as housing instability11. While other SDOH included on the AHC Screening Tool, such as food insecurity, transportation barriers, and financial strain are already measured by existing items on the MCBS questionnaire, adding these two questions on housing insecurity will increase CMS’ ability to analyze health disparities in Medicare, which is critically important for quality improvement and responsiveness to public health emergencies like COVID-19. Medicare providers currently submit some social risk data to CMS through claims, in the form of ICD-10-CM Z codes. However, analysis performed by the CMS OMH Data and Policy Analytics Group (DPAG) group on Z code utilization and reported in a Data Highlight12, shows only limited reporting via this mechanism. In addition, some social risk data is collected through providers participating in the AHC model, and some information will be collected by providers through Post-Acute Care assessment tools. However, this data collection is only for certain patients and does not allow for an analysis across health care settings. The additional information from the MCBS questions will supplement research on the impact housing insecurity has on Medicare beneficiaries and which Medicare populations this is impacting the most. Additionally, the Beneficiary Care Management Program (a QIO initiative) is assisting Medicare beneficiaries with discharge planning which is very challenging when patients are experiencing housing insecurity. Since the MCBS does not currently measure housing insecurity, we do not know the magnitude of the problem to aid in this essential planning.

An ASPE commissioned report titled “Accounting For Social Risk Factors In Medicare Payment13” recognized that social risk factors (including inadequate housing) contribute to health disparities. Furthermore, the IMPACT Act recognized that social risk factors play a major role in health, even requiring additional research related to social risk in Medicare’s value-based payment programs. The social isolation and housing insecurity questions would provide additional data on a large set of Medicare beneficiaries, providing a source for additional research and policy guidance. This subset of items was selected specifically because these two elements are of critical importance to assessing the impact of social risk factors and are represented in the AHC screening tool and in other IMPACT Act-related analysis and research, and are not already collected in some way through the MCBS.

Revise the Health Status and Functioning Questionnaire (HFQ) to Add Item about Social Isolation.

Beginning in Fall 2021 Round 91, the MCBS will add one item to the HFQ as part of the Fall round interview to assess social isolation among Medicare beneficiaries. Distinct from loneliness, social isolation refers to an actual or perceived lack of contact with other people, such as living alone or residing in a remote area. Social isolation tends to increase with age, is a risk factor for physical and mental illness, and is also a predictor of mortality. This measure is currently part of the Accountable Health Communities (AHC) Health-Related Social Needs Screening Tool in the Family and Community Support domain. The AHC question was selected from the Patient-Reported Outcomes Measurement Information System (PROMIS®) question Bank on Emotional Distress. The AHC Screening Tool was developed by a panel of interdisciplinary experts that reviewed evidence-based ways to measure SDOH, such as social isolation14. While the MCBS questionnaire already includes similar items from the AHC Screening Tool related to mental health, substance use, and assistance with activities of daily living and instrumental activities of daily living, adding a new measure on social isolation will improve CMS’ ability to analyze health disparities in Medicare, which is critically important for quality improvement and responsiveness to public health emergencies like COVID-19.

A recent JAMA article15 titled “Social Isolation and Loneliness: Imperatives for Health Care in a Post-COVID World” reports that social determinants (including social isolation), have been found to be responsible for 80%-90% of health outcomes. Due to COVID-19, there is an increase in housing instability and homelessness related to difficulty paying rent and mortgages, and increased risk of adverse COVID-19 effects related to housing instability and homelessness. The pandemic has also led to unprecedented social distancing, quarantines and isolation procedures that have had a profound effect on social isolation. These factors disproportionately impact minority populations and further exacerbates underlying health disparities. Eliminating racial and ethnic disparities is one of the foundational principles of the legacy CMS Quality Strategy, and the CMS Meaningful Measures framework establishes eliminating disparities as a cross-cutting criteria to be applied to any area of measurement. Additionally, the Network of Quality Improvement and Innovation Contractors’ 12th Scope Of Work includes a significant focus on behavioral health and discharge planning, which is integrally tied to social determinants around housing instability and homelessness and social isolation, for CMS quality improvement contractors across health care settings.



Closing gaps in data collection and improving our understanding of individuals’ social risk factors can help protect our most vulnerable beneficiaries, especially during the pandemic. It is critically important that CMS improve its understanding of beneficiaries’ social risk factors in order to protect health and ensure beneficiaries get the right care, in the right place, at the right time.

As described in Supporting Statement Part A, with the emergence of the COVID-19 pandemic in the U.S., CMS implemented a number of changes to the MCBS to ensure the health and safety of both respondents and field interviewers while continuing data collection. In March 2020, CMS paused in-person data collection in both community and facility settings. Field interviewers have conducted MCBS interviews by telephone since that time and anticipates at least some phone collection to continue. .

Collection of Physical Measures, including grip strength, assumes that some in-person interviewing resumes in May of 2021 and gradually increases over time. The Physical Measures Questionnaire will only be administered during in-person interviews.

Rounds 89 through 97 Data Collection Procedures
    1. Interviews with incoming panel sample persons in community. In the Fall rounds (Round 91, 94, 97), all newly selected beneficiaries will be mailed a Community Advance Letter from the Centers for Medicare and Medicaid Services (Attachment 2). Advance mail materials have been developed for interviews conducted in person as well as interviews conducted by phone. For in-person data collection, the advance letter explains that an interviewer will be visiting the beneficiary to conduct the survey. For telephone data collection, the advance letter states that an interviewer will call to conduct the survey by phone. If data collection is conducted in-person, field interviewers will carry copies of the advance letter for respondents who do not recall receiving one in the mail, as well as a copy of the MCBS Community Brochure and At the Door Sheet (Attachment 2). Baseline cases receiving COVID-19 questions in their second round (Round 89), and NGACO enrollees receiving COVID-19 only questions, will also receive an advance letter explaining the purpose of the COVID-19 questions (Attachment 2).

The Community interviews (Rounds 89-97) will be administered to the respondent or a designated proxy using a CAPI program on a laptop computer. Attachment 3 includes a copy of all questionnaire sections administered in the baseline interview, the continuing interview, and the Showcards used by the interviewer to assist in the interviewing process.

At the completion of the baseline interview (Rounds 91, 94, 97), each new respondent is provided with a MCBS calendar (Attachment 2), on which he or she is encouraged to record health care events. The same calendar is provided to all Continuing Community respondents on a calendar year basis. When data collection is conducted by phone during the Fall round, the calendar is mailed to respondents.

    1. Interviews with sample persons in institutions. All Facility interviews are administered to facility staff using a CAPI program on a laptop computer. For all facility residents, the Facility Eligibility Screener is administered each time a respondent is found to have entered a facility, or in the case of baseline respondents, is currently in a facility (Attachment 4). The Facility instrument to be used in Rounds 89-97 is shown in Attachment 5.

An advance letter is sent to all facilities each time a respondent is found to have entered a facility, or in the case of baseline respondents, is currently in a facility (Attachment 6). This advance letter has been tailored for interviews conducted in person as well as interviews conducted by phone.

Some facility administrators will require consent of the sample person or a next of kin before releasing any information. The data collection contractor will offer to obtain such written consent, using the Resident Consent Form, and Next of Kin Consent Form. These forms as well as a HIPAA letter are included in Attachment 6.

  1. Proxy rules.

For Community respondents, the preferred mode is self-response. Respondents are asked to designate proxy respondents. These are individuals who are knowledgeable about the respondent’s health care. In the MCBS, only those individuals who are designated by the respondents can serve as proxy respondents.

Upon screening a facility where a facility resident is residing, the interviewers determine the appropriate staff at the facility best able to respond. MCBS interviewers do not interview residents in a facility. Instead, interviewers are trained to determine and seek out the appropriate staff for the interview. When appropriate, interviewers abstract information from available facility records. If a respondent is incarcerated, we do not seek response. Other institutions will be treated on a case-by-case basis.

B3. Methods for Maximizing Response Rates and Dealing with Issues of Non-Response

The sample for the MCBS is a heterogeneous population that presents a unique challenge for maximizing response rates. The survey selects respondents from two Medicare groups—those age 65 and over and those younger than 65 who have disabilities. Both of these groups have characteristics that often lead to refusals on surveys. Increasing age, poor health or poor health of a family member are prevalent reasons for refusal. On the other hand, older persons are the least mobile segment of the population and thus, for a longitudinal survey, less likely to be lost due to failure to locate. Recent data on the MCBS indicate that the population aged under 65 tends to have a slightly higher response rate than the aged population.

Because this is a longitudinal survey, it is essential that we maximize the response rates. In order to do so, data collection staff undertakes an extensive outreach effort each round. This includes the notification of government entities about the survey including CMS regional offices and hotline, carriers and fiscal intermediaries, and Social Security Offices, national organizations including the AARP and various community groups (e.g., social service and health departments, home health agencies, state advocates for the elderly and area agencies on aging). These efforts are undertaken to answer questions or concerns that respondents may have in order to increase the likelihood that respondents would participate in the MCBS and remain in the survey panel.

Further, with the shift to telephone outreach and interviewing in March 2020 due to the COVID-19 pandemic, additional efforts were introduced in Fall 2020 Round 88 to maximize participation among new Incoming Panel members. Whether interviewing in person or by telephone, prefield locating activities (including electronic database searches using LexisNexis® Accurint®) are always run on the Incoming Panel to verify or update addresses and to obtain telephone numbers when available. Based on past experience using electronic database searches, telephone numbers are initially available for just over 50 percent of beneficiaries; using additional electronic database searches has increased initial telephone matches to 92 percent overall. To maximize outreach, two additional advance mailings—a reminder letter and a final reminder postcard (Attachment 2)—were introduced in Fall 2020, along with locating and tracing efforts to increase the availability of phone numbers and maximize response; use of these materials were approved by OMB on June 15, 2020 in a non-substantive change request (ICR reference number 202006-0938-01. An analysis of whether there is a change in the characteristics of the baseline response due to the change in mode administration (comparing Fall 2019 Round 85 in person to Fall 2020 Round 88 telephone collection) will be presented in the forthcoming MCBS 2020 Methodology Report. A similar reminder letter will be used in Winter 2021 Round 89 to maximize outreach among NGACO enrollees who have not completed their COVID-19 interview after three weeks of data collection (Attachment 2).


Specifically, efforts to maximize response rates include: 1) informing authoritative sources to whom respondents are likely to turn if they question the legitimacy of the MCBS; 2) giving interviewers resources to which they can refer to reassure respondents of the legitimacy/importance of the survey; 3) generally making information about MCBS available through senior centers and other networks to which respondents are likely to belong or reach out (such as the 1-800-Medicare hotline); and 4) mailing reminder postcards and letters to respondents to encourage their participation in the survey.

CMS intensively monitors both unconditional and conditional response rates. The unconditional response rate is the percentage of sample that were released during the fall round of the selection year and responded to the survey in a given year. The unconditional response rates, also called cumulative response rates, use the original selected sample size as the baseline in their calculation. Conditional response rates are the percentage of sample that were eligible at the beginning of the fall round of a particular year and responded during that year. Conditional response rates use the sample who are eligible to participate in the survey (a subset of the sample released in the fall round of the selection year) as the baseline in their calculation. In other words, they are conditioned on eligibility. Both indicators are very important for understanding trends about response rates and where interventions should optimally be targeted. These trends are monitored over the full historical span of the survey, providing important insights in changes to response rates over time.

Response is also tracked throughout each round by a host of key indicators including panel, HHS region, age, race, ethnicity, residential status (community or facility), current year Medicare enrollees or not-current year enrollees. In addition, performance by field interviewers is also tracked to identify any staff who need additional training or support to improve their interview completion rates. CMS continually analyzes response rates, particularly for the subpopulations with the lowest propensity to respond, and is fully committed to finding ways to stem declining response rates.

In addition to outreach, the following efforts remain in place to maintain a sense of validity and relevance among the survey participants.

  1. An advance letter is sent to both sampled beneficiaries and facility administrators from CMS with the CMS Survey Director’s signature. This includes an informational brochure answering anticipated questions. A reminder postcard and reminder letter are also sent to encourage response (Attachment 2 and 6).

  2. A handout with Privacy Act information and an appeal to participate is given to the respondent at the door by the interviewer (Attachment 2).

  3. Interviewer training emphasizes techniques and approaches effective in communicating with the older and disabled population and ways to overcome difficulties respondents may have in participating.

  4. Individualized non-response letters are sent to respondents who refuse to participate (example included in Attachment 2). These letters are used when deemed appropriate by the field management staff.

  5. NORC field management staff are specialized to follow up with respondents who express concerns about participating due to privacy or confidentiality questions.

  6. Proxy respondents are sought for respondents unable to participate for themselves in order to keep respondents in the survey over the life of the panel.

  7. Non-respondents are re-contacted by a refusal conversion specialist.

  8. A dedicated project email address ([email protected]) and toll-free number (1-877-389- 3429) is available to answer respondent's questions. This information is contained on various materials provided to the respondent.

  9. An MCBS website (mcbs.norc.org) contains information for respondents on the project. Respondents are also informed about the CMS MCBS Project Page – www.cms.gov/mcbs

  10. Respondents receive an annual MCBS newsletter, which includes information about the survey as well as seasonal topics such as winter safety tips for seniors. Attachment 2 contains an example of a recent newsletter.

  11. Whenever possible, the respondent is paired with the same interviewer throughout the survey. This maintains rapport and establishes continuity of process in the interview.

  12. Interviewers are trained to utilize personal touches such as thank you notes and birthday cards to maintain contact with respondents.

  13. A Community Authority Letter (Attachment 2) is sent to community organizations in advance of the Fall rounds (Rounds 91, 94, 97) to inform community representatives, such as state-level departments of aging, insurance, and state senior Medicare patrol officers, about the MCBS. A Community Authority Letter (Attachment 2) will also be sent via email to NGACO administrators prior to the start of data collection in Winter 2021 Round 89 to explain the purpose of the COVID-19 items and the importance of participation from NGACO enrollees.

A non-response bias analysis for the MCBS was conducted for the first time in 2017 and released as part of the 2015 MCBS Methodology Report [16]. An updated non-response bias analysis for the MCBS is underway based on the 2018 Panel and will be released in the final 2018 Methodology Report. While non-response is carefully monitored every year, a complete non-response bias analysis is updated every three years to ascertain trends both annually and for subpopulations. The upcoming non-response bias analysis will also include beneficiaries who participated in COVID-19 surveys.

Fall 2015 respondents and non-respondents were compared on various measures, including frame characteristics, Medicare claims payments, and chronic conditions, in order to identify areas of potential bias. The only statistically significant differences were found among frame characteristics. For the 2015 Panel, non-respondents appear more likely to be female and older, and slightly less likely to be non-Hispanic black. Among the continuing panels, however, non-respondents tend to skew younger. None of the differences is large in a practical sense. The weighting procedure includes a raking step that accounts for all of the frame characteristics for which differences were found. Thus, the small potential bias identified via these analyses is expected to be minimized by the weighting procedures. In contrast to most surveys, the MCBS has a large amount of information to characterize non-respondents. This information, including Medicare claims data, can be used for imputation if necessary.
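The raking step described above can be illustrated with a brief sketch of iterative proportional fitting. The categories and control totals below are hypothetical, not actual MCBS frame characteristics or control totals.

```python
# Minimal sketch of raking (iterative proportional fitting): weights are
# repeatedly scaled so that weighted totals match control totals for each
# frame characteristic in turn, until the adjustments converge.

def rake(weights, categories, margins, n_iter=50, tol=1e-8):
    """Adjust weights so weighted totals match each margin's control totals.

    weights    : list of base weights, one per respondent
    categories : dict mapping variable name -> list of category codes,
                 one per respondent
    margins    : dict mapping variable name -> {category: control total}
    """
    w = list(weights)
    for _ in range(n_iter):
        max_change = 0.0
        for var, targets in margins.items():
            cats = categories[var]
            # current weighted total in each category of this variable
            totals = {c: 0.0 for c in targets}
            for wi, c in zip(w, cats):
                totals[c] += wi
            # scale weights so this variable's margins hit the targets
            for i, c in enumerate(cats):
                factor = targets[c] / totals[c]
                max_change = max(max_change, abs(factor - 1.0))
                w[i] *= factor
        if max_change < tol:
            break
    return w

# Hypothetical example: four respondents raked to sex and age-group totals.
base = [1.0, 1.0, 1.0, 1.0]
cats = {"sex": ["F", "F", "M", "M"], "age": ["65-74", "75+", "65-74", "75+"]}
marg = {"sex": {"F": 60.0, "M": 40.0}, "age": {"65-74": 55.0, "75+": 45.0}}
raked = rake(base, cats, marg)
```

After raking, the weighted totals reproduce both sets of control totals simultaneously, which is why raking on the frame characteristics showing response differences reduces the associated bias.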

The following patterns of nonresponse have been observed over the rounds. In the most recent three rounds for which a full analysis of response rates has been completed, round-level response rates for continuing panels remain high, ranging from 80.0% for the 2015 panel in Round 76 to 96.0% for the 2012 panel in Round 75. Despite these high rates, each year continuing panels are subjected to a nonresponse adjustment based on new response propensity models by panel. Incoming panels at the first interview (e.g., the 2015 panel at Round 73) show a larger propensity for nonresponse because they have never been contacted prior to the first interview. In Round 76 the response rate for the 2016 Incoming panel was 54.7%. Once again we rely on cells derived from response propensity models to account for differential effects of demographic and geographic characteristics on the resulting data. In 2016 the covariates most closely related to response propensity in the incoming panel were: the mean response rate over the previous 5 years in the same county; entitlement for Part B (2-level: yes, no); age category (7-level: under 45, 45 to 64, 65 to 69, 70 to 74, 75 to 79, 80 to 84, and 85 years or older); and tract-level median household income for households where the householder is at least 65 years of age (4-level: quartiles of median household income in the past 12 months, in 2015 inflation-adjusted dollars). By accounting for these characteristics in constructing the adjustment cells, we reduce the potential for nonresponse bias that could arise from these differential factors.
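As a rough illustration of how propensity-derived adjustment cells work, the sketch below sorts sampled units into equal-size cells by estimated response propensity and inflates respondent weights within each cell to carry the weight of the cell's nonrespondents. The propensity scores, which in practice would come from a response propensity model using covariates such as those listed above, are hypothetical here.

```python
# Sketch of a weighting-class nonresponse adjustment within cells formed
# from estimated response propensities. Scores and weights are illustrative.

def adjust_for_nonresponse(weights, propensities, responded, n_cells=5):
    """Sort units into n_cells equal-size cells by propensity; within each
    cell, inflate respondent weights so the cell's total weight is preserved.
    Assumes every cell contains at least one respondent."""
    order = sorted(range(len(weights)), key=lambda i: propensities[i])
    cell_size = len(order) // n_cells
    adjusted = [0.0] * len(weights)
    for c in range(n_cells):
        lo = c * cell_size
        hi = len(order) if c == n_cells - 1 else lo + cell_size
        cell = order[lo:hi]
        total = sum(weights[i] for i in cell)
        resp_total = sum(weights[i] for i in cell if responded[i])
        factor = total / resp_total
        for i in cell:
            adjusted[i] = weights[i] * factor if responded[i] else 0.0
    return adjusted

# Hypothetical example: ten sampled units, three nonrespondents.
weights = [1.0] * 10
propensities = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
responded = [True, False, True, True, False, True, True, False, True, True]
adjusted = adjust_for_nonresponse(weights, propensities, responded)
```

Because the adjustment redistributes weight only within cells of similar response propensity, the total weight is preserved while units resembling nonrespondents receive proportionally larger adjustments.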

Adaptive design methods have also been applied to measure the representativeness of the MCBS incoming sample. In 2017, CMS conducted a review of Representativity Indicators (R-indicators) for the Fall 2017 Baseline interview to monitor the representativeness of the achieved sample. The R-indicators provided a quantitative assessment of which segments of the sample were over- or under-producing and causing the achieved sample to be imbalanced in terms of sample representativeness.

A sample R-indicator as well as two partial R-indicators (variable and category) are used to monitor the representativeness of the panel. The variable R-indicator measures the representativeness of the sample associated with each variable (looking at the strength of each covariate subpopulation, such as race, ethnicity, age, sex, and region) in predicting response propensity. The category R-indicator then looks at the categories of each variable to measure the representativeness of the responding sample.
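The sample R-indicator can be sketched from its standard definition, R = 1 - 2 * S(p), where S(p) is the standard deviation of the estimated response propensities p. The sketch below omits design weighting, and the propensity values are hypothetical.

```python
# Sketch of the sample R-indicator: R = 1 means every sampled unit is
# equally likely to respond (fully representative response); values
# closer to 0 signal imbalance in response propensities.
import statistics

def sample_r_indicator(propensities):
    """Unweighted sketch; production versions weight by design weights."""
    return 1.0 - 2.0 * statistics.pstdev(propensities)

# Equal propensities give R = 1.0; spread in propensities lowers R.
r_equal = sample_r_indicator([0.7, 0.7, 0.7, 0.7])
r_spread = sample_r_indicator([0.5, 0.9])
```

Partial (variable- and category-level) R-indicators decompose this same propensity variation by covariate and by covariate category, which is what allows specific under-producing segments to be identified.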

In Fall 2016 and Fall 2017, no R-indicators were observed outside established thresholds; consequently, no data collection interventions were needed to improve the representativeness of the achieved sample. Use of R-indicators, along with continual review of annual and historical response rates and non-response bias analyses, is an important tool in understanding response and ensuring that the sample as a whole, as well as subpopulations, is represented so as to produce high-quality data. Future analysis will also compare the R-indicators found in in-person data collection with those from telephone data collection for the baseline sample.

Analysis of Shift to Phone Survey Administration

With the emergence of the COVID-19 pandemic in the U.S., CMS paused in-person data collection in both community and facility settings to ensure the health and safety of interviewers and respondents. While MCBS interviewers have always had the flexibility to conduct an interview by phone, rarely were phone interviews conducted due to the complexity of collecting detailed cost and utilization data that rely on the review of statements. However, since late March 2020, field interviewers have conducted MCBS interviews only by telephone. CMS continues to monitor potential impacts on survey operations and data quality via comprehensive analyses of paradata and response patterns from data collected via phone in 2020.

Impact to Survey Operations

It is especially challenging to disentangle the impact of the pandemic from the impact of the change in interview mode. In Winter 2020 Round 86, which was conducted from January 10, 2020 through April 26, 2020, data collection was scheduled to be conducted in person. However, on March 14, all in-person Facility data collection was paused and on March 18, all in-person Community interviewing was paused. There was no Facility data collection for two days and no Community data collection for five days; phone data collection started on March 16 in facilities and on March 24 in community settings with only a small number of field interviewers. As cases were completed, additional interviewers were trained on conducting the phone interviews. At the end of the round, only 65% of the interviewers had returned to work and conducted phone interviews. Some interviewers chose not to work as the impact of the pandemic worsened; others did not want to conduct phone interviews and preferred to work only on in-person data collection.

As shown in Table B.6, the overall response rate in Winter Round 86 was approximately 8.6 percentage points lower than the response rate of the previous Winter Round (Round 83) and 8.5 percentage points lower than the average response rate of the previous two winter rounds. Thus, while the MCBS was able to shift to phone in a short period of time, the pandemic, the slowdown in data collection, and the abrupt shift to 100% phone collection lowered final response rates for that round. Following standard MCBS protocol, many cases that did not complete a Round 86 interview but had completed a Fall 2019 Round 85 interview were fielded again in Summer 2020 Round 87, thus reducing the impact of the drop in response rate.

In Summer 2020 Round 87, data collection was conducted entirely by phone, and the overall response rate was approximately 2.5 percentage points higher than the response rate of the previous Summer Round (Round 84) and 1.8 percentage points higher than the average response rates of the previous two summer rounds. Thus, at least one metric – response rate – showed that phone data collection would not negatively impact the MCBS.

Table B.6: Summary of Response Rates by Round

Round | Type   | 2020 Panel | 2019 Panel | 2018 Panel | All Panels | Continuing Panels
80    | Winter |     --     |     --     |     --     |    86.6    |       91.4
81    | Summer |     --     |     --     |     --     |    86.3    |       89.3
82    | Fall   |     --     |     --     |    55.9    |    70.7    |       87.6
83    | Winter |     --     |     --     |    81.1    |    86.7    |       90.9
84    | Summer |     --     |     --     |    82.1    |    85.0    |       87.3
85    | Fall   |     --     |    55.1    |    84.7    |    71.2    |       88.4
86    | Winter |     --     |    73.7    |    74.8    |    78.1    |       81.0
87    | Summer |     --     |    83.5    |    89.3    |    87.5    |       90.8
88    | Fall   |    41.6    |    83.9    |    88.9    |    59.4    |       87.4



Finally, looking at the response rates for the Fall 2020 Round 88, we see a mixed result. While the overall response rate in Fall Round 88 was approximately 11.7 percentage points lower than the response rate of the previous Fall Round (Round 85) and 11.5 percentage points lower than the average response rates of the previous two fall rounds, most of this difference is associated with the baseline panel, which was expected. Conducting phone outreach and data collection for a baseline panel (e.g., the 2020 Incoming Panel) had never been attempted before. Phone numbers were not available from the Medicare enrollment database, and therefore new methods had to be established to connect with sampled beneficiaries.



Because there was no historical data for phone data collection with a baseline panel, CMS and NORC determined that the release of a buffer sample would be most prudent in order to achieve the target of 5,873 completed cases for the 2020 Panel. The release of the buffer sample added more than 4,000 cases to the sample, which was known in advance to lower the response rate.

In Round 88, the response rate for the new 2020 Panel was approximately 13.5 percentage points lower than the previous baseline panel response rate for the 2019 Panel in Round 85, and approximately 13.9 percentage points lower than the average response rate of the last two baseline panels (the 2018 and 2019 panels in Rounds 82 and 85, respectively).

In Round 88, the response rate for the other three Continuing Panels (2017, 2018 and 2019 Panels) was approximately one percentage point lower than the Continuing Panel response rate in Round 85, and less than one percentage point lower than the average Continuing Panel response rates of the previous two fall rounds. Due to the round occurring during the November election period, a slight drop in response rates was anticipated. These response rates are based on final in-round field reports. Final response rates are published in the annual Methodology Report.

Impact to Data Quality

A robust analysis of paradata and response patterns demonstrates stability of the representativeness and quality of MCBS data collected via phone. Although the 2020 Panel had a lower response rate than previous baseline panels, internal calculations of representativity indicators (R-Indicators), which measure variability in estimated response propensities based on key population characteristics, indicate that data collection in the 2020 panel was balanced and representative of the true underlying population in terms of demographic characteristics of race, sex, age, Hispanicity, and HHS geographic region. This suggests that representativeness of the achieved 2020 Panel is on par with past MCBS panels in their first round.

An in-depth analysis of changes in response patterns between MCBS data collected during 2020 via telephone interviews compared with data collected prior to the pandemic via in-person interviews revealed limited evidence of data quality problems with phone administration. This analysis, which spanned all three data collection rounds of 2020 and included nearly 500 questionnaire variables and paradata variables from the Community and Facility Interviews, used a model-based approach to assess the stability of trends in responses from 2016 through 2019 and then assessed the degree to which the 2020 data maintained or broke those trends. Relatively large decreases in healthcare utilization and cost reporting were observed, particularly in the Community interviews in Rounds 86 and 88 and Facility interviews in Round 87, but these are likely due in large part to actual decreases in utilization during the pandemic and are consistent with findings from other analyses of health care utilization among Medicare beneficiaries in 2020 using Fee for Service claims data [17]. There were some indications of difficulties collecting data requiring physical access to documentation such as healthcare statements and prescription medicine, which were partly due to measures taken to reduce respondent burden. Overall, few questionnaire sections showed substantial shifts in response patterns or increases in item-level nonresponse in 2020.

B4. Tests of Procedures or Methods

The generic clearance for Questionnaire Testing and Methodological Research for the MCBS was approved by OMB in May 2015 and received approval for an extension without change on May 18, 2018 (OMB No. 0938-1275, expiration 05/31/2021). The generic clearance encompasses development and testing of MCBS questionnaires, instrumentation, and methodological experiments. It contains approval for seven types of potential research activities:

1) cognitive interviewing, 2) focus groups, 3) usability testing, 4) field testing, 5) respondent debriefing questionnaire, 6) split ballot and other methodological experiments, and 7) research about incentives. Any future changes to the MCBS instrumentation, data collection methods, or procedures that require testing will be submitted as individual collection requests under the generic clearance.

On May 7, 2020, OMB approved CMS-10549 GenIC#7 MCBS COVID-19 Rapid Response Supplement Testing under the MCBS Generic Clearance (0938-1275). The field test was conducted with MCBS respondents living in the Community from June 10 to July 15, 2020. The data were collected in parallel with the MCBS Summer 2020 Round 87 production. Testing the questions and methodology provided meaningful information for the MCBS COVID-19 Supplement. It demonstrated that the questions worked as intended and that the flow and administration by phone was smooth. It also showed that conducting a standalone supplemental interview simultaneous to main MCBS data collection would be successful, both in terms of response to the supplement and in not harming main MCBS data collection. MCBS respondents willingly participated in the 15-minute survey. The total sample size for the Summer 2020 COVID-19 Supplement Test was 14,332 cases. Of these, 11,114 sampled beneficiaries were interviewed during the testing period, far exceeding the target of 8,000 completed interviews; the overall response rate was 78.9 percent, calculated using guidelines specified by the American Association for Public Opinion Research (AAPOR) and OMB. CMS released data from the supplement in a special public use file in October 2020 [18]. These data will also be provided to users as part of an MCBS 2019 Limited Data Set scheduled for release in summer 2021. Results from the field test are also included in Attachment 8.
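The AAPOR-style response rate calculation referenced above can be illustrated with a simplified sketch. The formula is in the spirit of AAPOR's rates that allocate unknown-eligibility cases using an estimated eligibility rate; the disposition counts below are hypothetical, not the actual supplement dispositions.

```python
# Simplified sketch of an AAPOR-style response rate (similar in spirit to
# AAPOR RR3): completes divided by the estimated eligible sample, where a
# fraction e of unknown-eligibility cases is assumed eligible. Counts are
# hypothetical.

def response_rate(completes, eligible_nonrespondents, unknown_eligibility, e):
    """Completes over estimated eligible sample size."""
    denom = completes + eligible_nonrespondents + e * unknown_eligibility
    return completes / denom

rate = response_rate(completes=800, eligible_nonrespondents=150,
                     unknown_eligibility=100, e=0.5)  # 800 / 1000 = 0.8
```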

Based on the initial success of the Summer 2020 COVID-19 Supplement, CMS then requested OMB approval to continue COVID-19 Supplement collection in Fall 2020 Round 88. The Fall 2020 request expanded the data collection to MCBS respondents living in long term care facilities. Some terminology and questions were also changed in the Fall 2020 COVID-19 Supplement to align with other Federal surveys or to meet additional needs of CMS and CDC collaborators. Please refer to Attachment 9 for a full crosswalk of changes documenting the differences from the Summer 2020 COVID-19 Supplement to the Fall 2020 COVID-19 Supplement. Through an emergency clearance, OMB approved CMS’ request on August 7, 2020 (0938-1379, expiration 02/28/2021). Collection of the Fall COVID-19 Supplement began October 5, 2020 and ended on December 31, 2020. CMS released data from the Fall 2020 Community COVID-19 Supplement in a special PUF in January 2021 [19]. Both the community and facility data will also be released as part of an MCBS 2020 Limited Data Set scheduled for release in summer 2022.


This request revises the main MCBS clearance (0938-0568) to add the COVID-19 questions to the MCBS beginning in Winter 2021. The COVID-19 questions are the same as those approved by OMB on August 7, 2020 for administration in Fall 2020 Round 88. These items will be fielded each round as long as it is relevant to do so.

B5. Individuals Consulted on Statistical Aspects of Design

The person responsible for statistical aspects of design is:

Edward Mulrow, Ph.D., Vice President

NORC at the University of Chicago

4350 East-West Highway, 8th Floor

Bethesda, MD 20814

(301) 634-9441

[email protected]

The contractor collecting the information is NORC at the University of Chicago.

1 Note that the historical target of 11,500 responding beneficiaries across all panels was not achievable in 2019; the target was reduced to 9,996, which was the maximum number of completed interviews achievable within budget.

2 Note that prior to 2017, 107 PSUs were used for sampling for the MCBS. These included three PSUs in Puerto Rico. Beginning in 2017, Puerto Rico was removed from the MCBS sampling frame.

3 Beginning in 2017, the 18 SSUs selected from the three Puerto Rico PSUs were removed from the sampling frame, leaving 685 SSUs for sampling for the MCBS.

4 For example, persons who became eligible for Medicare during 2015 could have incurred health care costs in 2015. By including such persons in the sampling process up to a year earlier than was done previously, they can be appropriately represented in the 2015 Cost Supplement File up to a year earlier.


5 Events and costs incurred after enrollment in Medicare but prior to the first interview.

6 The COVID-19 Questionnaire will be administered each round as long as it is relevant to do so, given the trajectory of the pandemic.

7 The COVID-19 Questionnaire will be administered each round as long as it is relevant to do so, given the trajectory of the pandemic.

8 Available from: https://www2.census.gov/programs-surveys/demo/technical-documentation/hhp/Phase3_Questionnaire_01_06_21_English.pdf

10 Additional information available from: https://www.cdc.gov/nhsn/pdfs/covid19/ltcf/cms-covid19-req-508.pdf

11 More information about the AHC Screening Tool is available on the model webpage at https://innovation.cms.gov/Files/worksheets/ahcm-screeningtool.pdf.

12 Available from: https://www.cms.gov/files/document/cms-omh-january2020-zcode-data-highlightpdf.pdf

13 Available from: https://www.nap.edu/catalog/23635/accounting-for-social-risk-factors-in-medicare-payment

14 More information about the AHC Screening Tool is available on the model webpage at https://innovation.cms.gov/Files/worksheets/ahcm-screeningtool.pdf.

15 Available from: https://jamanetwork.com/channels/health-forum/fullarticle/2774708

17 For more information, please refer to: https://aspe.hhs.gov/system/files/pdf/264071/Medicare-FFS-Spending-Utilization.pdf

18 Available from: https://www.cms.gov/Research-Statistics-Data-and-Systems/Downloadable-Public-Use-Files/MCBS-Public-Use-File

19 Available from: https://www.cms.gov/Research-Statistics-Data-and-Systems/Downloadable-Public-Use-Files/MCBS-Public-Use-File
