CMS-P-0015A (0938-0568) Supporting Statement B - 6-22-16 (clean)


Medicare Current Beneficiary Survey (MCBS)

OMB: 0938-0568



Supporting Statement B

for Revision of Currently Approved Collection: Medicare Current Beneficiary Survey (MCBS)





Contact Information: William S. Long

Contracting Officer’s Representative, Medicare Current Beneficiary Survey
Office of Enterprise Data and Analytics (OEDA)/CMS

7500 Security Boulevard, Mailstop B2-04-12
Baltimore, MD 21244

(410) 786-7927

[email protected]

(410) 786-5515 (fax)




April 13, 2016

Table of Contents

B. STATISTICAL METHODS

B1. Universe and Respondent Selection

B2. Procedures for Collecting Information

B3. Methods for Maximizing Response Rates and Dealing with Issues of Non-Response

B4. Tests of Procedures or Methods

B5. Individuals Consulted on Statistical Aspects of Design

LIST OF ATTACHMENTS



Attachment 1: 60-day Federal Register Notice

Attachment 2: Community Advance Letter – English
              MCBS Community Brochure – English
              At the Door Sheet – English
              MCBS Calendar – English
              Income and Assets (IAQ) Brochure – English
              Community Authority Letter
              CMS Thank You Letter (Community) – English
              MCBS Respondent Newsletter

Attachment 3: Representation of the Community continuous core instrument for Rounds 76-83

Attachment 4: Community Instrument (Supplemental and Continuing) and Showcards

Attachment 5: Facility Eligibility Screener

Attachment 6: Facility Instrument (Baseline and Core) and Showcards

Attachment 7: Facility Advance Letter – English
              MCBS Facility Brochure – English
              Resident Consent Form
              Next of Kin Consent Form
              HIPAA Letter – English

B. STATISTICAL METHODS

The revision to this OMB package includes the following modifications to the sampling design and Community questionnaire sections:

    • Permanently modifying the sample design by increasing the sample size to achieve additional Hispanic completed cases (beginning in fall 2016), as well as additional dual-eligible (Medicare/Medicaid) completed cases (beginning in 2017).

    • Adding a new section on Use of Nicotine and Alcohol (NAQ).

    • Updating the Health Status and Functioning (HFQ) section.

    • Reducing the number of items asked in the Patient Perceptions of Integrated Care/Usual Source of Care (PPIC/USQ) section.

    • Adding a question to the Demographics and Income (DIQ) section that measures English literacy among respondents with Limited English Proficiency.


B1. Universe and Respondent Selection

The target universe is current Medicare beneficiaries entitled to hospital and/or supplementary medical insurance, living in the 50 states, the District of Columbia, and Puerto Rico. Both institutionalized and non-institutionalized beneficiaries are represented. Table B.1 summarizes the number of beneficiaries in the target universe based on CMS administrative records through 2015 and projected estimates for 2016 and 2017. The seven age groups shown in the table correspond to the primary sampling strata from which the samples for the MCBS are drawn. The age groups are defined by the beneficiaries’ age as of July 1 of the given year.

Table B.1: Universe Counts Broken Down by MCBS Age Groups (in thousands)

Age Interval        2012        2013        2014        2015     2016 (est.)  2017 (est.)
Disabled
  0 to 44       1,902.30    1,754.70    2,081.98    1,938.78     1,964.72      1,991.02
  45 to 64      6,506.78    6,291.14    7,147.45    7,207.86     7,475.57      7,753.23
  Total         8,409.42    8,045.84    9,229.42    9,146.64     9,440.30      9,744.24
Aged
  65 to 69     12,588.42   10,893.58   13,541.48   15,312.60    16,533.66     17,852.10
  70 to 74     10,091.88   11,010.72   10,973.99   11,640.90    12,217.06     12,821.74
  75 to 79      7,434.62    8,024.64    7,890.82    8,314.00     8,636.35      8,971.19
  80 to 84      5,701.80    5,989.26    5,767.31    5,999.42     6,106.62      6,215.73
  85+           6,169.66    7,228.14    6,626.77    7,045.62     7,401.59      7,775.54
  Total        41,986.38   43,146.34   44,800.37   48,312.54    50,895.27     53,636.29
Total          50,395.46   51,192.18   54,029.80   57,459.18    60,335.57     63,380.53

Source: Historical counts for 2012-2014 are based on CMS administrative records. Counts for 2015 are derived by multiplying the 2015 5-percent HISKEW sample sizes by twenty. Projections for 2016 and 2017 are based on the average annual rate of change observed from 2012 to 2015. Totals do not necessarily equal the sum of rounded components.
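The projection method described in the source note can be sketched as follows. This is an illustrative reconstruction that assumes the average of the year-over-year growth ratios is applied separately within each stratum; it is not CMS's production methodology.

```python
# Illustrative reconstruction of the Table B.1 projections (an assumption
# based on the table's source note, not CMS production code): each stratum's
# 2016 and 2017 counts are projected by applying the mean of the
# year-over-year growth ratios observed from 2012 through 2015.

def project(counts, n_years):
    """Project `counts` (historical totals, in thousands) forward
    `n_years` using the mean year-over-year growth ratio."""
    ratios = [b / a for a, b in zip(counts, counts[1:])]
    rate = sum(ratios) / len(ratios)
    projections, latest = [], counts[-1]
    for _ in range(n_years):
        latest *= rate
        projections.append(round(latest, 2))
    return projections

# Historical counts for the 85+ stratum, 2012-2015 (thousands)
aged_85_plus = [6169.66, 7228.14, 6626.77, 7045.62]
print(project(aged_85_plus, 2))  # close to the published 7,401.59 and 7,775.54
```

Applied to the 85+ stratum, this reproduces the published 2016 and 2017 estimates to within rounding, which suggests the projections are computed per stratum rather than on the overall total.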



The target sample size of the MCBS has traditionally been designed to yield 11,500 completed cases providing Cost and Use data per year (approximately 1,000 disabled enrollees under age 65 in each of two age strata, and 1,900 enrollees in each of five age strata for enrollees 65 or older). This clearance request permanently modifies the sample design by increasing the sample size to achieve 75 additional Hispanic completes in each Fall round target (2016, 2017, 2018), as well as up to 200 additional dual-eligible (Medicare/Medicaid) completed cases beginning in 2017 (Round 79) and every year thereafter. These oversamples will allow for improved precision of estimates in the analysis of Medicare cost and use for these two population groups, Hispanics and dual-eligible beneficiaries.

To achieve the desired number of completed cases, the MCBS selects new sample persons each year to compensate for nonresponse, attrition, and retirement of sample people in the oldest panel, and to include the newly eligible population, while continuing to interview the non-retired portion of the continuing sample.

Through 2009, the MCBS generally added approximately 6,500 beneficiaries to the sample in the September-December round each year to replace the retiring panel and to offset sample losses due to non-response and attrition. This number can increase or decrease depending on available resources and the extent of non-response in previous rounds. For example, beginning in the fall round of 2010, the number of beneficiaries added to the sample was increased to approximately 7,400 to compensate for declining response rates. By 2015, the sample had increased further to approximately 8,500, with 123 additional cases to support Hispanic oversampling. Approximately 2,400 sample persons in the oldest panel are retired from the study in the May-August round each year, but this number also varies from year to year. As a result, the sample size averages approximately 14,700 interviews per round, yielding up to 11,500 cases with completed annual utilization and expenditure information.

Sample persons who refuse one or more rounds or who cannot be located for one of the scheduled interviews are not counted as completed cases. On the other hand, proxy interviews are attempted for deceased sample persons; if data are collected through the date of death, such cases are counted as completes. For sample persons who reside in both a community and a facility setting, the round is considered complete only if both the community and facility interviews are completed. Sample persons remain in the survey when they are unavailable for an interview in a given round; that is, they are carried forward into the next round. For these individuals, the reference period for the next core interview covers the period since their last interview so that there is no gap in coverage of utilization and expenditure data. Modules are round-specific and administered only once a year. If a sample person is unavailable for two rounds in a row, no further follow-up is scheduled, because extension of the recall period beyond eight months is not feasible; such cases are treated as nonresponding cases.

A broad range of statistics is produced from the MCBS. The sample design has stressed robustness and generalizability rather than customization for specific goals. However, recent changes to the sampling methodology allow for analysis of additional groups of beneficiaries. The MCBS will continue to oversample the extreme elderly and disabled beneficiaries.

Beginning in 2015, the Hispanic population is also oversampled, with a goal of obtaining an additional 75 completed interviews per year from Hispanics residing outside of Puerto Rico. This clearance proposes that in the Fall of 2016 (Round 76) and every year thereafter this oversample continue, thus permanently increasing by 75 completes the number of Hispanics represented in the data. Also beginning in 2015, current-year enrollees are included in the sampling frame of beneficiaries from which the new panel sample is selected.1 And beginning in 2017, 200





1 Under the historical MCBS system, current-year enrollees would not be sampled until the year following enrollment.

additional dual-eligible (Medicare/Medicaid) beneficiaries are planned to be added to the Fall round sample targets.2

The methodology for drawing the samples is described later in this document. The number of cases selected each year (the designated sample sizes) is larger than the targeted number of completes to compensate for non-response, ineligibility, and attrition. Table B.2 illustrates the extent of compensation needed in Round 73 to achieve the desired number of cases providing annual data.

Table B.2: Sample Size Needed to Compensate for Initial Non-Response and Ineligibility in the 2015 Fall Round

Age on July 1 of    Desired average number of      Number sampled
reference year      cases providing annual data    at Round 73*
0-44                334                              845
45-64               334                              513
65-69               630                            1,304
70-74               630                            1,337
75-79               630                            1,468
80-84               630                            1,475
85+                 630                            1,562
Total               3,818                          8,504

*Excluding Hispanic oversampling cases
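The relationship between a designated sample size and a target number of completes can be illustrated with a minimal sketch. The yield rate used below is merely implied by the figures in Table B.2; it is not an MCBS operational specification.

```python
import math

# A minimal sketch (not the MCBS's actual sample-release algorithm) of how
# a designated sample size can be derived from a target number of completes
# and an expected overall yield rate (response x eligibility x retention).

def designated_sample_size(target_completes, expected_yield):
    """Number of beneficiaries to sample so that, at the expected yield
    rate, the target number of annual-data completes is achieved."""
    return math.ceil(target_completes / expected_yield)

# Yield implied for the 85+ stratum in Table B.2: 630 desired / 1,562 sampled
implied_yield = 630 / 1562          # roughly 0.40
print(designated_sample_size(630, implied_yield))  # 1562
```

The older strata show lower implied yields (for example, 630/1,562 for the 85+ group versus 334/513 for ages 45-64), consistent with the heavier compensation the text describes for those groups.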


Cross-sectional sample sizes for other domains. There are multiple domains of interest in the MCBS (for example, respondents with end-stage renal disease, persons residing in nursing homes, managed care enrollees, beneficiaries of various racial and ethnic backgrounds, and Medicaid recipients). The MCBS will continue to maintain a minimum target of 12,000 completed responses annually3 to help ensure that analysis can be performed on MCBS data for many domains of interest.

Sample sizes for longitudinal analyses. Under the rotating panel design specified for the MCBS, respondents remain in the sample for up to twelve rounds of data collection over a four year time period. The historical response rates and attrition rates observed in the MCBS are used to determine the rotational sample size and configuration of each new incoming panel. The rotational sample design attempts to achieve consistency in subgroup sample sizes across all panels comprising a particular calendar year.

Table B.3 presents the round-by-round conditional and cumulative response rates as of Round 70 (fall round of 2014) for the samples (referred to in the table as “panels”) selected in 2007 through




2 For purposes of computing respondent burden, we assume that the dual-eligible oversample will start in 2017 and continue each year thereafter.

3 This includes completed cases from oversampled populations, including Hispanics and dual-eligibles.

2014. For example, the bottom panel of the table shows that by the 10th round of data collection for the 2011 panel, 44.2 percent of that panel were still in a formal responding status (that is, the sample person was either alive and still participating in the study or had died but a cooperative proxy was found to supply data on the last months of life). For the 2012 and 2013 panels, the cumulative response rates as of Round 70 were 44.0 percent (through the 7th round of data collection) and 50.5 percent (through the 4th round of data collection), respectively. The 2014 panel (the new panel selected in Round 70) had an initial response rate of 58.4 percent in its first round of data collection.

Round 70 (Fall 2014) is the latest round for which MCBS data have been processed. There were 2,444 interviews successfully completed at Round 70 with still-living members of the 2011 panel. For brevity, we refer to these 2,444 interviews as “live completes”. For the 2012 and 2013 panels there were 2,541 and 3,057 live Round 70 completes, respectively. For the first round of data collection for the 2014 panel, there were 6,359 completes at Round 70.

The MCBS has used a variety of techniques to maintain respondents in the survey and reduce attrition. These will be continued and adapted to comply with the time frames for initiating and implementing the continuous sample.

B2. Procedures for Collecting Information


This section describes the procedures used to select the samples for the national survey. It includes a general discussion of the statistical methodology for stratification and rotational panel selection, estimation procedures, and the degree of accuracy needed. This is followed by a presentation of how topical modules are used to enhance the analytic potential of the MCBS data. The content of the continuous or core questionnaires is then summarized. Finally, there is a discussion of rules for allowing proxy response.


  1. Statistical Methodology for Stratification and Sample Selection

This section opens with a description of the MCBS sample design, followed by a general discussion of the selection of the original and supplemental samples and of the use each year of a 5-percent sample of the Health Insurance Master File (HIM), also referred to as the 5-percent HISKEW sample, to reduce problems associated with duplication of samples across years.

    a. PSU and Census tract clustering. The MCBS employs a complex multistage probability sample design. At the first stage of selection, the sample consists of 107 primary sampling units (PSUs) defined to be metropolitan areas and clusters of nonmetropolitan counties. At the second stage of selection, samples of Census tracts are selected within the sampled PSUs. At the third and final stage of selection, stratified samples of beneficiaries within the selected Census tracts are sampled at rates that depend on age group, race/ethnicity, and Puerto Rican residency.

The strata used for selection of the PSUs cover the 50 states, the District of Columbia, and Puerto Rico. Since PSUs were selected randomly with probabilities proportionate to size, there are some states without any sample PSUs within their boundaries. Within major strata defined by region and metropolitan status, PSUs were sorted by percent of beneficiaries enrolled in HMOs and/or percent of beneficiaries who are minorities (based on data in CMS administrative files), and substrata of roughly equal size were created from the ordered list for sample selection.

Table B.3: Conditional Response Rates as of Round 70 for Medicare Current Beneficiary Survey by Interview Round

            2007      2008      2009      2010      2011      2012      2013      2014
            Panel     Panel     Panel     Panel     Panel     Panel     Panel     Panel
            (n=6680)  (n=5532)  (n=6915)  (n=7260)  (n=7365)  (n=7400)  (n=7400)  (n=11625)
Round 1     80.3%     78.0%     77.5%     77.5%     77.4%     73.1%     72.7%     58.4%*
Round 2     90.5%     90.7%     89.8%     89.5%     89.0%     88.2%     86.6%
Round 3     93.7%     94.3%     92.7%     94.0%     92.2%     92.9%     92.6%
Round 4     94.9%     95.5%     94.9%     94.3%     92.8%     93.2%     86.5%
Round 5     96.7%     96.8%     96.1%     95.9%     94.8%     94.9%
Round 6     97.6%     97.7%     97.3%     95.5%     96.1%     95.0%
Round 7     97.6%     97.7%     97.2%     95.0%     95.7%     87.6%
Round 8     97.9%     97.9%     97.8%     96.7%     96.4%
Round 9     98.4%     98.3%     97.6%     97.5%     96.7%
Round 10    98.7%     98.8%     97.5%     97.7%     92.3%
Round 11    99.5%     99.3%     99.2%     98.8%
Round 12    99.9%     99.8%     99.6%     99.6%

Cumulative Response Rate for Medicare Current Beneficiary Survey by Interview Round

            2007      2008      2009      2010      2011      2012      2013      2014
            Panel     Panel     Panel     Panel     Panel     Panel     Panel     Panel
            (n=6680)  (n=5532)  (n=6915)  (n=7260)  (n=7365)  (n=7400)  (n=7400)  (n=11625)
Round 1     80.3%     78.0%     77.5%     77.5%     77.4%     73.1%     72.7%     58.4%*
Round 2     72.7%     70.7%     69.6%     69.4%     68.9%     64.5%     63.0%
Round 3     68.1%     66.7%     64.5%     65.2%     63.5%     59.9%     58.3%
Round 4     64.6%     63.7%     61.2%     61.5%     59.0%     55.8%     50.5%
Round 5     62.5%     61.7%     58.8%     58.9%     55.9%     52.9%
Round 6     61.0%     60.3%     57.2%     56.3%     53.7%     50.3%
Round 7     59.6%     58.9%     55.6%     53.5%     51.4%     44.0%
Round 8     58.3%     57.7%     54.4%     51.8%     49.6%
Round 9     57.4%     56.7%     53.1%     50.5%     47.9%
Round 10    56.6%     56.0%     51.8%     49.3%     44.2%
Round 11    56.4%     55.6%     51.4%     48.7%
Round 12    56.3%     55.5%     51.2%     48.5%
*The 2014 panel response rate was affected by several operational design changes arising from the transition between contractors in 2014, including an extensive CAPI instrument development effort originally considered out of scope for transition purposes; the initial need to release a larger 2014 supplemental sample to account for a smaller continuing sample fielded in the fall of 2014; the hiring and training of 100 new interviewers on the MCBS data collection; and the decision to extend the supplemental panel data collection through the release of additional replicates in December 2014, resulting in a shorter data collection period and consequently a lower response rate for 2,500 sample members.
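The cumulative rates in the lower panel of Table B.3 appear to be running products of the round-by-round conditional rates in the upper panel. The sketch below illustrates that relationship for the 2011 panel; agreement with the published figures is within rounding of the published conditional rates.

```python
# Sketch of the apparent relationship between the two panels of Table B.3:
# each cumulative response rate is the running product of the conditional
# (round-by-round) rates. Values agree with the published cumulative rates
# to within rounding, since the published conditionals are themselves rounded.

def cumulative_rates(conditional_pcts):
    """Convert conditional response rates (percent) to cumulative rates."""
    out, running = [], 1.0
    for pct in conditional_pcts:
        running *= pct / 100.0
        out.append(round(running * 100, 1))
    return out

# Conditional rates for the 2011 panel, Rounds 1-10 (Table B.3, upper panel)
panel_2011 = [77.4, 89.0, 92.2, 92.8, 94.8, 96.1, 95.7, 96.4, 96.7, 92.3]
print(cumulative_rates(panel_2011))
# the final value matches the published 44.2 percent through Round 10
```

For example, 77.4% x 89.0% gives the published Round 2 cumulative rate of 68.9%.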


Within the PSUs, a sample of 703 second-stage units (SSUs) consisting of Census tracts or clusters of adjacent tracts was selected. There were several steps in the SSU sampling process. First, an extract of the entire Medicare Enrollment Database (EDB) was obtained, and all beneficiaries’ addresses were geocoded to the tract level. A minimum measure of size was used to determine whether a Census tract was large enough (i.e., had enough Medicare beneficiaries) to stand on its own as an SSU or would need to be combined with one or more adjacent tracts. A frame of 24,212 SSUs was then constructed, and a sample of 703 SSUs was selected using systematic probability-proportional-to-size (PPS) sampling. An additional sample of 339 reserve SSUs was also selected to support an expansion of the sample or the study of special rare populations in future years. To date, these reserve SSUs have not been used for MCBS sampling.
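Systematic probability-proportional-to-size selection, the technique described above for drawing the SSU sample, can be sketched generically as follows. The frame below is a toy example, not MCBS data.

```python
import random

# A generic sketch of systematic probability-proportional-to-size (PPS)
# selection, the technique the text describes for drawing 703 SSUs from a
# frame of 24,212. The frame below is a toy example, not MCBS data.

def systematic_pps(sizes, n, rng=random):
    """Select n unit indices with probability proportional to `sizes`,
    using a single random start and a fixed skip interval."""
    total = sum(sizes)
    interval = total / n
    start = rng.uniform(0, interval)
    targets = [start + k * interval for k in range(n)]
    selected, cum, idx = [], 0.0, 0
    for t in targets:
        # advance to the unit whose cumulative-size range contains t
        while cum + sizes[idx] <= t:
            cum += sizes[idx]
            idx += 1
        selected.append(idx)
    return selected

rng = random.Random(7)
frame = [rng.randint(50, 500) for _ in range(200)]   # toy measures of size
sample = systematic_pps(frame, 20, rng)
print(len(sample))  # 20
```

Because units are ordered (here implicitly; in the MCBS, by stratification variables) before the systematic pass, this approach also spreads the sample across the frame, which is the property that makes the sorted substrata described above effective.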

    b. Selection of beneficiaries. At the inception of the MCBS, an initial sample of over 15,000 beneficiaries was selected from the 5-percent sample of the Health Insurance Master File (HIM), also referred to as a 5-percent HISKEW. This sample was clustered within the selected PSUs and ZIP fragments (used to construct SSUs at that time) and was designed to achieve uniform sampling weights within each of the seven age domains at the national level. Beginning in Round 10, with the transition to a rotating panel design, samples of approximately 6,450 beneficiaries (eligible on January 1 of each year) have been selected from a 5-percent HISKEW each year. Beginning in 2015, beneficiaries eligible anytime during the sampling year are also included in the HISKEW sampling frame. Nursing home residents are drawn into the sample in exactly the same manner as other beneficiaries residing in the community.


Each year, a new supplementary sample (referred to as a panel) is selected for the MCBS. To determine the appropriate sample sizes for the new panel, the MCBS sample sizes achieved in the prior year are reviewed in April of each year. New projections are then made of the sample size necessary to obtain the targeted number of responding cases in subsequent Cost and Use data releases. For example, it was projected that roughly 8,504 sample beneficiaries, plus 123 additional Hispanic beneficiaries, would be needed for the 2015 panel (the latest panel selected for the MCBS) to meet sample size and oversample goals. Looking to the future, it is projected that 8,504 sample beneficiaries, plus an additional 123 beneficiaries to support Hispanic oversampling, will be needed for the 2016 panel, and that a total of 8,955 sample beneficiaries (8,504 sample beneficiaries, plus an additional 123 to support Hispanic oversampling and 328 to support dual-eligible oversampling) will be selected annually for 2017, 2018, and 2019 to ensure that the Cost and Use data products are based on 12,000 completed cases.

  2. Estimation Procedure

To date, sampling weights have been calculated for each Fall round (Rounds 1, 4, 7, …, 70) in order to produce the Access to Care series. Both cross-sectional and longitudinal weights have been calculated. These weights reflect differential probabilities of selection and were adjusted to account for overlapping coverage of the panels included in the Access to Care data file and for non-response. Replicate weights were also calculated so that users can calculate standard errors using replication methods. In addition to the replicate weights, stratum and unit codes exist on each weight file for users who prefer to use Taylor Series methods to estimate variances.
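Replication variance estimation of the kind the replicate weights support can be illustrated with the generic formula below. The scale factor and data are hypothetical, and this is not the MCBS's specific replicate-weight construction.

```python
# A generic replication-variance sketch: the variance of an estimate is
# built from the squared deviations of its replicate estimates. The scale
# factor and data here are hypothetical illustrations, not MCBS values.

def replicate_variance(full_estimate, replicate_estimates, scale=1.0):
    """Sum of squared deviations of replicate estimates from the
    full-sample estimate, times a method-specific scale factor
    (e.g. (R-1)/R for delete-one jackknife, 1/R for BRR-type methods)."""
    return scale * sum((r - full_estimate) ** 2 for r in replicate_estimates)

full = 120.0                          # hypothetical full-sample estimate
reps = [118.5, 121.2, 119.8, 120.9]   # hypothetical replicate estimates
var = replicate_variance(full, reps, scale=1.0 / len(reps))
se = var ** 0.5
print(round(se, 2))
```

In practice, a user would form each replicate estimate by re-running the point estimate with one of the replicate weight columns supplied on the weight file.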

Besides standard weighting and replicate weighting, the estimation program also includes full imputation of the data sets to compensate for item non-response. Imputation procedures have been developed for charges for non-covered services and for sources of payment for covered services in the Cost and Use annual file. The weighting and imputation of data will continue.

  3. Degree of accuracy needed for the purpose described in the justification

A broad range of statistics will be produced from the MCBS. There is no single attribute of beneficiaries and their medical expenses that stands out as the primary goal of the survey. Thus, there can be no simple criterion for the degree of reliability that statistics for each analytic domain should satisfy. Even with a sample size of 14,000 to 15,000 persons, there will be many small domains of interest for which it will be necessary to use modeling techniques or to wait several years for sufficient data to accumulate. Examples include people with specific medical conditions (e.g., hip fractures), institutionalized persons under age 65, and sample persons experiencing spend down.

The MCBS will maintain a stratified approach to the selection of the sample. The sample will continue to be clustered by PSU and Census tract-based SSU and stratified by age domain, race/ethnicity, and Puerto Rican residency; the tract-based SSU approach was an innovation first begun in 2014 which has resulted in greater efficiencies and increased analytic opportunities. We anticipate maintaining a total of 2,000 annual cases allocated to the two younger age categories for disabled beneficiaries who are not yet 65. The two age categories were selected because they indirectly reflect the means by which the disabled person becomes eligible for Medicare. Since the number of disabled sample persons per PSU and Census tract will be small, the effects of clustering on statistical precision should be mild for this subgroup. Thus, with an effective sample size of 1,000 or more for each age stratum, accuracy for each of the two age strata should not be much different from that commonly attained in public opinion surveys. For example, depending on the prevalence of the characteristic being estimated, the MCBS has achieved standard errors for estimates of percentages ranging from 2-3% or lower for subgroup estimates based on 1,000 respondents. Since many of the cost and reimbursement statistics derived from the MCBS may be heavily right-skewed (i.e., reflecting the higher end of the cost/reimbursement spectrum to a disproportionate degree), the accuracy may be lower in relative terms but still acceptable. For example, the relative standard error of the mean total Medicare reimbursements derived from the MCBS has generally ranged from 2.0-2.5% for the total sample, and 4.0-8.0% for subgroups.
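The precision claim for subgroups of roughly 1,000 respondents can be checked with a simple-random-sampling approximation; any design effect from clustering or weight variation would inflate these values, consistent with the 2-3% range cited above.

```python
import math

# Illustrative check of the precision claim for a subgroup of about 1,000
# respondents: the standard error of an estimated percentage p under a
# simple-random-sampling approximation (a design effect would inflate it)
# is sqrt(p * (1 - p) / n_eff).

def se_of_percentage(p, n_eff):
    """Standard error (in percentage points) of an estimated percentage
    p, given an effective sample size n_eff."""
    frac = p / 100.0
    return 100.0 * math.sqrt(frac * (1.0 - frac) / n_eff)

# Worst case (p = 50%) for an effective sample size of 1,000
print(round(se_of_percentage(50, 1000), 2))  # about 1.58 points
```

The worst-case value of about 1.6 percentage points, inflated by a modest design effect, falls in the 2-3% range the text reports for subgroup estimates based on 1,000 respondents.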

Each of the age strata for the aged Medicare sample will be allocated 1,900 cases, with the oldest stratum (age 85 and over) being allocated 2,000 cases. A major reason for over sampling the very old is to obtain an adequate sample of nursing home stays. Variations in sampling weights across the age strata and clustering within PSU and Census tract will inflate sampling errors, but the resulting effective sample sizes should be adequate for most analyses.

  4. Interview content for periodic data collection cycles to reduce burden.

1) Content and timing of the core interview. The primary variables of interest for the MCBS are the use and cost of health care services and associated sources and amounts of payment. While Medicare claims files supply information on billed amounts and Medicare payments for covered services, the survey provides information on use of services not covered by Medicare and on payment sources and amounts for costs not reimbursed by Medicare. For both the Community and Facility core components, the primary focus of the data collection is on use of services (dental, hospital, physician, medical providers, prescription medication and other medical services), sources and amounts of payment, and health insurance coverage. The “core” MCBS interview collects continuous information on these items through thrice-yearly interviews. The Community component also contains summary components, which update the household enumeration, health insurance status and follow-up on cost and sources of payment information for “open items” from the previous interview.

Continuous data on utilization and expenditures are required for a number of reasons. First, several of the distinct expenditure categories involve relatively rare medical events (inpatient hospital stays, use of home health care, purchase of durable medical equipment), so limiting the reference period would mean insufficient observations for national estimates. Second, episodes of medical care often consist of a series of services over weeks or months; continuous data will allow examination of the grouping of services around particular episodes of care. This is particularly important when a number of medical services are included in a global fee. Third, payment for medical services often occurs considerably later than the utilization, so collection of complete information about a particular event can often only be obtained sometime after the event occurs. In addition, this emphasis on utilization and expenditures will formulate an excellent baseline to monitor both Medicare reform and CMS’ program management effectiveness.

The administration of the instruments will continue to follow the established pattern of data collection, i.e., baseline information will be collected in the initial interview with new panel respondents; this first interview is also referred to as the Supplemental interview. This will be followed in all subsequent interviews with the core component. The core Community and Facility components are administered in the second interview (Winter round) to maintain a consistent reporting period for utilization and expenditure data. Since the initial interview always occurs in the last four months of a calendar year, collection of utilization and expenditure data in the second interview means the reference period will always begin prior to January 1st. This creates use and expenditure estimates on a calendar year basis.

Modules that collect baseline information such as access to care, insurance coverage, household enumeration and demographic questions will be asked during the first interview; in addition, reference dates will be established in Rounds 76, 79 and 82 and so forth for those individuals new to the MCBS. The core components are administered in every round thereafter. After the first interview, we administer the core questionnaire in addition to topic specific modules.

The literature (initially reported by Neter and Waksberg in 1964, and confirmed in subsequent research by other analysts) indicates that collection of behavioral information over an unbounded recall period can result in large recall errors. A part of the initial interview (Rounds 76, 79 and 82) prepares the respondent for the collection of utilization and expenditure information in subsequent rounds, thus “bounding” the recall period for the next interview. In addition, at the conclusion of the initial interview, the respondent (new Supplemental sample only) is provided with a calendar. This calendar marks the recall period for the respondent, serves as the means to record utilization, and acts as a prompt to retain statements and bills.

  5. Content of the core/continuous questionnaire, Rounds 76-83.

Most of the questionnaires as currently approved by OMB are unchanged. The revision to this OMB package includes the following changes to the Community questionnaire.

Summary of changes beginning in Round 76 (Fall 2016) through Round 83 (Summer 2018):

    • The addition of a new section on Use of Nicotine and Alcohol (NAQ). This new section incorporates current measures of nicotine and alcohol use, updates wording, and moves the measures into a single module.

    • Updating the Health Status and Functioning (HFQ) module to use more current terminology and improve mental health measures as specified in the DHHS standards.

    • Moving some items from HFQ into Preventive Care (PVQ) and updating terminology.

    • Reducing the number of items asked in the Patient Perceptions of Integrated Care /Usual Source of Care (PPIC/USQ) section.

    • Adding a question to the Demographics and Income (DIQ) module that measures English literacy among respondents with Limited English Proficiency.

Community Questionnaire.

Introduction (INQ), Enumeration (ENS), Address Verification (AVQ), and Housing Characteristics (HAQ) sections. In the initial Supplemental interview, the MCBS collects information on the household composition, including descriptive data on the household members such as age, gender and relationship to the respondent. The respondent is asked about their housing situation and living arrangements. We also verify the respondent’s address and telephone number. This information is updated in each subsequent Continuing interview round.

Health Insurance (HIQ). In the initial interview, we collect information on all sources of secondary health insurance, both public and private, which cover the respondent. Included are questions about premium, coverage, primary insured, source of the policy (i.e., private purchase, employer sponsored, etc.) and type of health care delivery system. This information is updated in each subsequent round.

Utilization Series. This section collects information on the respondent’s use of medical services. We specifically ask about utilization of the following services: Dental Utilization (DUQ), Emergency Room Utilization (ERQ), Inpatient Utilization (IPQ), Outpatient Utilization (OPQ), Institutional Utilization (IUQ) (skilled nursing home services, intermediate care facility services, etc.), Home Health Utilization (HHQ), Medical Provider Utilization (MPQ) (medical doctors, chiropractors, physical therapists, etc.), Prescribed Medicine Utilization (PMQ), and Other Medical Expenses (OMQ). For each type of service reported, we collect information on the source of care, type of provider, date the service was provided, and whether medications were prescribed as part of the event. This episodic information is collected for all services since the date of the last interview.

Charge Series: Statement and No Statement Series. These sections collect information on costs, charges, reimbursements, and sources of payment for the health care services reported in the Utilization Series. If a respondent has an insurance statement (Medicare Summary Notice or private health insurance statement) for a reported medical service, the Statement Charge (STQ) series is administered. If the respondent indicates that a statement for a reported service has not been received but is expected, we defer asking about that service until the statement is received. If the respondent does not have and does not expect to receive a statement, the No Statement Charge (NSQ) series is asked. Questions are asked about the cost of the services, charges, expected reimbursement, and potential or actual sources of payment (including other family members).

Summary Information. Updates and corrections are collected through the summaries: the Enumeration Summary (ENS), Health Insurance Summary (HIS), Prescribed Medicine Summary (PMS), and Charge Payment Summary (CPS). For the enumeration, insurance, and utilization sections, the respondent is shown the information reported or updated in the previous round on the field interviewer's laptop and is asked to review it and inform the interviewer of any corrections or modifications. Updates to prescribed medication use can be made at this time in the Prescribed Medicine Summary (PMS). In addition, information for events that remained open in the previous round (i.e., the respondent expected to receive a statement but had not received it at the time of the last interview) is collected in the Charge Payment Summary (CPS), in a manner consistent with the Statement or No Statement series. As a note, MCBS respondents are given a calendar/planner to help them track all health care costs and use.
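The statement-based routing described in the Charge Series above can be summarized as a simple decision rule. The sketch below is purely illustrative: the section names (STQ, NSQ, CPS) come from the text, but the function name and status labels are hypothetical and do not represent the actual CAPI instrument logic.

```python
# Hypothetical sketch of the statement-based charge-series routing.
# Section names (STQ, NSQ, CPS) are from the text; the function name
# and status labels are illustrative only.

def route_charge_series(statement_status: str) -> str:
    """Decide which charge series to administer for a reported medical event."""
    if statement_status == "has_statement":
        return "STQ"           # Statement Charge series, this round
    if statement_status == "expects_statement":
        return "defer_to_CPS"  # revisit in the next round's Charge Payment Summary
    if statement_status == "no_statement_expected":
        return "NSQ"           # No Statement Charge series
    raise ValueError(f"unknown statement status: {statement_status!r}")
```

The deferral branch is what links the Charge Series to the Charge Payment Summary: open events carry forward one round rather than being asked without documentation.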

Facility Questionnaire.

The Facility component collects information that is similar in content to the Community interview. Sections of the Facility instrument parallel the Community instrument (e.g., the residence history parallels the Community household Enumeration section, the provider probes capture information similar to the Community Utilization Series, and the institutional charge series parallels the Community Charge Series, both Statement and No Statement). Differences between the Facility and Community components result from differences in the setting of the interview and the types of respondents. The Facility questionnaire is administered by the interviewer to one or more proxy respondents designated by the facility director, while the Community instrument is administered to the respondent or their designated proxy. Both the Community and Facility interviews are record driven; however, Facility respondents refer to formal medical care records, while in the Community the respondent depends on their own record-keeping. The core Facility instrument contains the following sections:

Residence History. This section collects continuous information on the residence status of the sample person, including current residence status, discharge and readmission.

Health Services. This section collects information on medical use by type of service. Types of providers and settings used are identified for reported medical events. In addition, information is collected on the number of times or volume of care received.

Prescribed Medicines. All medications administered in a facility are prescribed. Information is collected on the name, form, strength, and dispensing frequency of the medication.

Inpatient Hospital Stays. Information is collected on any inpatient hospital stays reported in the Residence History.

Institutional Charges. This section collects information from the institutions on the charges, reimbursement levels, and sources of payment for the sample person. Information is also collected on bad debt and other sources of differences between bills and payments.

  1. Content of topical modules. The MCBS interview consists of core items and one or more topical modules. The content of the modules is determined by the research needs of CMS, the Department, and other interested organizations/agencies, including the Medicare Payment Advisory Commission. Topics for the Community component include: income, assets, program knowledge and participation, demographic information, health and functional status, satisfaction with care, and usual source of care. For the Facility instrument topical supplements include the eligibility screener and the baseline instrument (contains questions on demographics and income, residential history, health status and functioning, type of housing and health insurance).

For the Community interview we are requesting clearance to continue to field the modules asked in the Fall round: Usual Source of Care/Patient Perceptions of Integrated Care (USQ/PPIC), Access to Care (ACQ), Satisfaction with Care (SCQ), Health Status and Functioning (HFQ), Health Insurance (HIQ), Household Enumeration (ENS), Housing Characteristics (HAQ), and Demographics and Income (DIQ). A module on Knowledge and Information Needs (KNQ) is asked in the Winter round along with the Preventive Care (PVQ) module. Income and Assets (IAQ), Prescription Medicine (PMQ) and PVQ are asked in the Summer round. For the Facility interview, we are requesting clearance for the eligibility screener and the baseline instrument.

Changes in this clearance request to the Community Questionnaire

OMB previously approved changes to the Community questionnaire made in Rounds 73, 74, and 75 through non-substantive change requests. CMS is requesting the following changes in content to the approved Community questionnaire beginning in Round 76.

    • The addition of a new section on Use of Nicotine and Alcohol (NAQ). This new section incorporates current measures of nicotine and alcohol use, updates wording, and moves the measures into a single module.

    • Updating the Health Status and Functioning (HFQ) module to use more current terminology and improve mental health measures as specified in the DHHS standards.

    • Moving some items from HFQ into Preventive Care (PVQ) and updating terminology.

    • Reducing the number of items asked in the Patient Perceptions of Integrated Care /Usual Source of Care (PPIC/USQ) section.

    • Adding a question to the Demographics and Income (DIQ) module that measures English literacy among respondents with Limited English Proficiency.

The goal of most of these changes is to bring the MCBS questionnaire in line with other national surveys (e.g., the National Health and Aging Trends Study (NHATS) and the National Health and Nutrition Examination Survey (NHANES)) that have more current wording of questions and response categories with well-established measures. In addition, the HHS Data Council has issued guidelines and new standards for measurement of certain topics which require revisions to the MCBS. Specifically:


  1. Use of Nicotine and Alcohol Module (NAQ): The MCBS has always asked questions about nicotine and alcohol use; these questions have traditionally been part of the Health Status and Functioning module. This change pulls those measures out and updates them so that they follow the wording and format of other national health surveys and adhere to HHS Data Council standards. Thus, a new Use of Nicotine and Alcohol (NAQ) module has been added to the Community instrument starting in Round 76. These measures provide important information on the health status and health care utilization of Medicare beneficiaries. The new section consolidates the currently approved nicotine and alcohol questions into a single module and includes items on cigarette smoking; use of smokeless tobacco products, cigars, and pipes; use of e-cigarettes; and alcohol use. The cigarette smoking and alcohol use items are recommendations from the Department of Health and Human Services (DHHS). The items on smokeless tobacco products, cigars, and pipes are from the National Survey on Drug Use and Health (NSDUH). There is one item on the use of e-cigarettes, based on the final recommendation of the HHS Data Council Tobacco working group.

  2. Health Status and Functioning (HFQ): This section largely remains unchanged, with a few exceptions. It has been revised to add an item asking whether the respondent has ever been contacted by a collection agency because of problems paying medical bills, and to revise the language of two items on intellectual disability to align with current national standards for how intellectual disability is described. The language of the physical functioning items has been updated to bring it in line with the items used in the Nagi disability scale, the standard used for geriatric respondents and used in other federally supported studies such as the Health and Retirement Study (HRS), sponsored by the NIA and NIH. We have also modernized and aligned many chronic illness and disease questions with the NHIS, including the specific cancer diagnoses and the language of the arthritis, stroke, and heart disease items. These modifications align the MCBS with the standard language used in other federal surveys asking these same questions, wherever possible. Finally, updated measures of mental health will be added to the HFQ. These include items on depression from the PHQ depression screener and items on anxiety from the GAD screener; both are recommended by the HHS guidelines for surveys that include analytic covariates on depression and anxiety.


  3. Preventive Care (PVQ): The items on preventive care in the HFQ module have been moved into the Preventive Care (PVQ) module. They include the items on blood pressure, cholesterol, mammograms, pap smears, hysterectomy, prostate surgery, digital rectal examination, and PSA test.


  4. Patient Perceptions of Integrated Care (PPIC/USQ): In Round 73, CMS consolidated usual source of care items with other questions that measured perceptions of integrated care. In Round 76, only a subset of these questions will be asked, with the goal of reducing the administration time for the section by deleting the combined USQ/PPIC administered in Round 73 and reverting to the shorter USQ version administered in Round 70. This entails removing the set of items comprising the PPIC supplement added to the questionnaire in Round 73, including questions about the respondent's opinions of the care they are receiving, whether the care is centered on the patient, and the level of communication between their medical providers. However, the shorter USQ module retains the updated "doctor or other health professional" terminology, to standardize the medical terminology used throughout the questionnaire. CMS will explore ways to shorten the PPIC supplement and will conduct cognitive testing before any possible re-introduction into the MCBS questionnaire.


  5. Demographics and Income (DIQ): Under the MCBS Generic Clearance for Questionnaire Testing and Methodological Research (OMB No. 0938-1275, exp. 05/31/2018), cognitive research was conducted to test expanding the existing three questions on Limited English Proficiency (LEP). As a result of this research, a new question will be added that measures English literacy (e.g., how well the sample person reads English). Additional research under this generic clearance was also conducted to test new questions on sexual and gender identity status. These questions will not be added to the MCBS because the projected low sample size would provide insufficient power for analysis. Additional background information is being provided on both of these cognitive testing protocols and the decisions made.

It is likely that some Community and Facility questionnaire sections and modules will be redesigned during the 3-year clearance period, including moving, deleting, and consolidating items and modules, with the goal of reducing burden. Most of these changes will be minor, reflecting improved wording of questions and response categories; many will replace current MCBS questions with similar questions taken from other national surveys. Any non-substantive changes will be submitted to OMB for approval. If the questionnaire changes are deemed substantive, CMS will submit a new request for revision of the currently approved collection.

  2. Rounds 76 through 83 data collection procedures.

  1. Interviews with sample persons in the community. In Rounds 76, 79, and 82, all newly selected respondents will be sent a Community Advance Letter (Attachment 2) from the Centers for Medicare and Medicaid Services. Field interviewers will carry copies of the advance letter for respondents who do not recall receiving one in the mail, as well as a copy of the MCBS Community Brochure and At the Door Sheet (Attachment 2). This process has been effective and will continue to be used.

The Community interview (Rounds 76-83) will be administered to the respondent or a designated proxy using a computer-assisted personal interviewing (CAPI) program on a laptop computer. A hard-copy representation of the continuous core CAPI interview for Rounds 76-83 for persons living in the community is shown in Attachment 3. Attachment 4 includes a copy of the instrument administered in the initial Supplemental interview and the Continuing interview, along with the Showcards used by the interviewer to assist in the interviewing process.

At the completion of the Supplemental interview (Rounds 76, 79, and 82), each new respondent is provided with an MCBS calendar (Attachment 2), on which he or she is encouraged to record health care events. The same calendar is provided to all Continuing Community respondents on a calendar-year basis.

  2. Interviews with sample persons in institutions. For the initial Facility interview, the Eligibility Screener, Baseline, and Core Questionnaires are administered. All Facility interviews are administered to facility staff using a CAPI program on a laptop computer. For all facility residents, the Facility Eligibility Screener is administered during the Fall of each year (Attachment 5). The Facility Baseline and Core questionnaires to be used in Rounds 76-83 are shown in Attachment 6.

Some facility administrators will require consent of the sample person or a next of kin before releasing any information. The data collection contractor will offer to obtain such written consent, using the Resident Consent Form, Next of Kin Consent Form, and HIPAA letter included as Attachment 7.

  3. Proxy rules.

For Community respondents, the preferred mode is self-response. During the initial interview (with updates in subsequent rounds), respondents are asked to designate proxy respondents: individuals who are knowledgeable about the respondent's health care and the costs and expenditures for that care. In the MCBS, only individuals designated by the respondent can serve as proxy respondents.

The facility setting presents a different and changing set of circumstances for the MCBS. In the past, the MCBS policy was to make no attempt to interview residents in a facility directly. However, changes in elderly care mean that interviewers now encounter facilities providing a wider range of services that fall outside the scope of traditional Medicare-certified facilities. In some cases, such as custodial care and assisted living communities, the best person to answer our questions is the beneficiary rather than facility staff. MCBS interviewers are now trained to determine and seek out the appropriate source for interviewing. For persons who move in and out of long-term care facilities, standard procedures will be used to determine the best respondent to provide data about the period spent outside such facilities. If a respondent is incarcerated, we will not seek self-response within a prison, but rather will monitor the respondent's incarceration status in case the person is released. Other institutions will be treated on a case-by-case basis.

B3. Methods for Maximizing Response Rates and Dealing with Issues of Non-Response


MCBS samples a heterogeneous population that presents a unique challenge for maximizing response rates. The survey selects respondents from two groups, aged and disabled Medicare beneficiaries, who have characteristics that often lead to refusals on surveys. Increasing age, poor health, or the poor health of a family member are prevalent reasons for refusal. On the other hand, older persons are the least mobile segment of the population and thus less likely to be lost due to failure to locate. The disabled population tends to have a slightly higher response rate than the aged population: while its percentage of sample losses due to death is comparable to that of the 70-74, 75-79, and 80-84 age brackets, its refusal rates are the lowest of all age categories.

Because this is a longitudinal survey, it is essential that we maximize response rates. To do so, survey staff undertake an extensive outreach effort annually. This includes notifying government entities (CMS regional offices and hotline, carriers and fiscal intermediaries, and Social Security offices), national organizations including the American Association of Retired Persons and The Arc of the United States, and various community groups (e.g., mayors' offices, police, social service and health departments, home health agencies, state advocates for the elderly, and area agencies on aging). These efforts are undertaken to increase the likelihood that respondents will answer the MCBS questions and remain in the survey panel by: 1) informing authoritative sources to whom respondents are likely to turn if they question the legitimacy of the MCBS; 2) giving interviewers resources to which they can refer to reassure respondents of the legitimacy and importance of the survey; and 3) generally making information about the MCBS available through senior centers, other networks to which respondents are likely to belong, and the CMS website.

In addition to outreach, the following efforts remain in place to maintain a sense of validity and relevance among the survey participants.

  1. An advance letter is sent to both potential sample persons and facility administrators from CMS with the Privacy Officer's signature. This includes an informational brochure answering anticipated questions.

  2. A handout with Privacy Act information and an appeal to participate is given to the respondent at the door by the interviewer.

  3. Interviewer training emphasizes the difficulties in communicating with the older population and ways to overcome these difficulties.

  4. Individualized non-response letters are sent to respondents who refuse to participate. These letters are used when deemed appropriate by the field management staff. CMS staff follows up with respondents who refused because of concerns about privacy and federal sponsorship of the survey.

  5. Proxy respondents are sought for respondents unable to participate for themselves.

  6. Non-respondents are re-contacted by a refusal conversion specialist.

  7. A dedicated project email address and toll-free number are available at NORC to answer respondents' questions.

  8. An MCBS website maintained by NORC contains information for respondents on the project.

  9. An e-mail address and website are available at CMS to answer respondents' questions.

  10. Respondents receive a quarterly MCBS newsletter, which includes information about the survey, as well as seasonal topics such as winter safety tips for seniors. Attachment 2 contains an example of a recent newsletter.

  11. Whenever possible, the respondent is paired with the same interviewer throughout the survey. This maintains rapport and establishes continuity of process in the interview.

  12. Periodic feedback mechanisms have been established. These include describing the availability of data, types of publications presenting MCBS data and preliminary findings presented in the form of data summaries.

  13. We encourage personal touches, including interviewer notes and birthday cards.


In contrast to most surveys, the MCBS has a large amount of information with which to characterize non-respondents. This information, including Medicare claims data, can be used for imputation if necessary. To minimize the risk of bias from non-response, models predicting the propensity not to respond are built from the extensive administrative databases available and from data collected in each earlier round. We then use the estimated response propensity to form cells within which respondent weights are adjusted. Simultaneously, the substantive characteristics of non-respondents will continue to be tracked in the administrative databases to monitor the risk of bias.


Over the rounds we have identified the following patterns of nonresponse, some stable and some changing over time. In the most recent three rounds we observed the following: the round-level response rates for continuing panels remain high, ranging from 78.5% for the 2013 panel in Round 70 to 99.6% for the 2010 panel in Round 69. Despite these high rates, each year continuing panels are subjected to a nonresponse adjustment based on new response propensity models fit by panel. Supplemental panels at the first interview (e.g., the 2014 panel at Round 70) show a larger propensity for nonresponse because they have never been reached prior to the first interview; in Round 70 the response rate for the 2014 supplemental panel was 58.7%. Once again we rely on cells derived from response propensity models to account for differential effects of demographic and geographic characteristics on the resulting data. In 2014 the covariates most closely related to response propensity in the supplemental panel were: the mean response rate over the previous 5 years in the same county; race (2-level: black, non-black); entitlement for Part B (2-level: yes, no); age category (7-level: <45, 45-64, 65-69, 70-74, 75-79, 80-84, 85+); and 4-level Census region. By accounting for these characteristics in constructing the adjustment cells, we reduce the potential for nonresponse bias that could arise from these differential factors.
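The cell-based propensity adjustment described above can be illustrated with a small sketch: fit a response-propensity model, group cases into quintile cells by predicted propensity, and ratio-adjust respondent base weights within each cell so that weighted totals are preserved. Everything below is a hypothetical illustration on simulated data, not the MCBS production weighting code; the covariates are generic stand-ins for the administrative variables named in the text.

```python
import numpy as np

# Illustrative weighting-class nonresponse adjustment (simulated data).
rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                    # stand-in covariates
eta_true = 0.5 + 0.8 * X[:, 0] - 0.5 * X[:, 1]
responded = rng.random(n) < 1 / (1 + np.exp(-eta_true))
base_wt = rng.uniform(1.0, 3.0, size=n)        # base sampling weights

def propensity(X, y, lr=0.5, n_iter=2000):
    """Estimate response propensities by logistic regression (gradient ascent)."""
    Xb = np.column_stack([np.ones(len(X)), X])  # add intercept
    beta = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-Xb @ beta))
        beta += lr * Xb.T @ (y - p) / len(y)    # average log-likelihood gradient
    return 1 / (1 + np.exp(-Xb @ beta))

phat = propensity(X, responded.astype(float))

# Five adjustment cells from propensity quintiles.
cells = np.digitize(phat, np.quantile(phat, [0.2, 0.4, 0.6, 0.8]))

adj_wt = np.zeros(n)                           # nonrespondents keep weight 0
for c in np.unique(cells):
    in_cell = cells == c
    resp = in_cell & responded
    factor = base_wt[in_cell].sum() / base_wt[resp].sum()
    adj_wt[resp] = base_wt[resp] * factor      # factor >= 1 within each cell
```

Within each cell the respondents' adjusted weights sum to the cell's full-sample weight total, so the adjustment redistributes, rather than loses, the weight of nonrespondents; this is the standard property any weighting-class adjustment of this kind preserves.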


B4. Tests of Procedures or Methods

The MCBS generic clearance for Questionnaire Testing and Methodological Research was approved by OMB in May 2015 (OMB No. 0938-1275, expiration 05/31/2018). The generic clearance encompasses development and testing of MCBS questionnaires, instrumentation, and methodological experiments. It contains approval for seven types of potential research activities: 1) cognitive interviewing, 2) focus groups, 3) usability testing, 4) field testing, 5) respondent debriefing questionnaires, 6) split-ballot and other methodological experiments, and 7) research about incentives.


Using this generic clearance, cognitive research was conducted to test expanding the existing three questions on Limited English proficiency (LEP). As a result of this research, a new question will be added that will measure English literacy (e.g., how well does the sample person read English). Additional research under this generic clearance was also conducted to test new questions on sexual and gender identity status. These questions will not be added to the MCBS due to the projected low sample size that will result in insufficient power for analysis. Additional background information is being provided on both of these cognitive testing protocols and the decisions made.


Any future changes to the MCBS instrumentation, data collection methods, or procedures that require testing will be submitted as individual collection requests under the generic clearance. The test results will be evaluated and reported to OMB before they are proposed as changes to the approved MCBS collection.

B5. Individuals Consulted on Statistical Aspects of Design


The person responsible for statistical aspects of design is:

Kirk Wolter, Ph.D.

Executive Vice President

NORC at the University of Chicago

55 East Monroe Street, 30th Floor

Chicago, Illinois

(312) 759-4206

[email protected]


The contractor collecting the information is NORC at the University of Chicago.
