Supporting Statement (C)


Medicare Current Beneficiary Survey (MCBS): Rounds 48-56 (CMS Number CMS-P-0015A)

OMB: 0938-0568


SUPPORTING STATEMENT

REQUEST FOR CLEARANCE:

MEDICARE CURRENT BENEFICIARY SURVEY

ROUNDS 48 THROUGH 56


Table of Contents


C. Collection of Information Employing Statistical Methods

C.1 Overview of the MCBS structure

C.2 Description of sample selection and universe

C.3 Data collection procedures

C.4 Methods to maximize response rates

C.5 Person responsible for statistical aspects of design


REFERENCES


LIST OF ATTACHMENTS


C. Collection of information employing statistical methods


C.1 Overview of the MCBS structure


C.1.a Continuation of the Basic MCBS Structure. We will continue to administer and operate the MCBS in the same manner as in Rounds 1-47. The MCBS will consist of four-month-long rounds, with each respondent (or proxy) interviewed three times per year. In the community, we will continue to conduct a new sample person's first interview in the September through December round using the initial instrument, which contains the following components and supplements: Introduction, Enumeration, Health Insurance, Usual Sources of Care, Satisfaction with Care, Health Status and Functioning, Provider Probes, and Demographics. Please see Attachment V for the household components and Attachment VIII for the facility components.


We will continue to use the initial interview for each entering cohort to initiate the reference period for the collection of health expenditure data. Every interview after the initial interview will collect cost and use information, using the preceding interview as a boundary for maintaining health care use and cost information (“since the last time I was here...”). Beginning with Round 2 of the MCBS, we initiated the collection of detailed health care use and cost information. This is the core or continuous part of the survey. With the exception of the replenishment sample, who are not asked about use and cost of services until the round after their initial interview, Rounds 2 through 47 collected cost and use information for medical, dental, hospital, nursing home, home health, prescription drug, and other medical expenses. This process will continue.


We will continue to collect cost information on an individual visit-by-visit basis, through the use of the “statement” and “no-statement” series of questions. If a respondent has received a Medicare or other public or private statement, the statement series will be administered. If the respondent indicates that they expect to receive a statement for a specific service, the statement series for that service will be postponed until the next round. The no-statement series will be administered if a respondent has not received, or has not kept, a statement for a particular service and does not expect to receive anything further on the service. The no-statement series will continue to rely on an individual's records, including bills, checks, and receipts, with the interviewer entering amounts as the respondent reports them.
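
The routing among these series follows directly from the description above. The minimal Python sketch below is offered only as an illustration of that logic; it is not the actual CAPI specification, and the function name and outcome labels are hypothetical.

# Illustrative sketch (not the actual CAPI specification) of the
# statement / no-statement routing described above.
def route_charge_series(has_statement, expects_statement):
    if has_statement:
        return "statement series"        # costs taken from the Medicare or insurer statement
    if expects_statement:
        return "defer to next round"     # event held open until the statement arrives
    return "no-statement series"         # costs taken from bills, checks, and receipts

print(route_charge_series(has_statement=False, expects_statement=True))
# -> defer to next round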


As in Rounds 1 through 47, for all rounds following the initial collection of detailed cost and use information the survey instrument includes a utilization summary that is integrated into the body of the instrument. The summary helps bound respondent recall and collect modifications or updates to service‑specific use reported in the last round. Billing and reimbursement processes are different for sample persons in institutions, and a summary of previously submitted information is not reviewed with the respondents.


C.1.b Benefits Derived From the Use of a Rotating Panel Design. Beginning with Round 10, we began to implement a rotating panel design. This design change was intended to reduce both respondent burden and the biases that may result from the cumulative attrition of the original or continuous sample. With the rotating panel design, we retain a respondent for 12 interviews, which yields three complete calendar years of data. This allows both cross-sectional and longitudinal analysis, as well as estimates of net change. Figure C.1 gives a visual display of the panel design, starting with Round 47.


Figure C.1 Rotating Panel Design





Our plan is to replace approximately four thousand sample persons each year. This creates three overlapping rounds of data collection for sample persons being phased out and those selected to replace them, so that a total of four panels are active at any one time. This overlap represents the overhead needed to begin and end an individual's participation in the MCBS. One of the overlapping rounds results from the fact that the first round of the MCBS is a baseline interview conducted prior to the start of the calendar year. The other two overhead interviews result from the fact that persons being retired from the survey must be interviewed for two rounds after the end of the calendar year in order to obtain complete utilization and expenditure information for the year.


The initial large panel of 15,411 beneficiaries was fielded in the fall of 1991. Smaller supplemental panels were added in the fall of 1992 and 1993. These supplementary panels afforded a chance of selection to beneficiaries who became entitled to either Part A or Part B benefits during 1991 and 1992, in addition to maintaining adequate sample sizes in the face of deaths and sample attrition. At the time that the first panel was fielded, no definite decision had been made on how many years to interview sample beneficiaries.

In 1993, a decision was made to phase out the 1991, 1992 and 1993 Panels after no more than six years of interviews and to limit future panels to four years of interviews. This meant that the new sample to be selected for 1994 had to be designed like the 1991 Panel so that it could eventually replace it, rather than being narrowly focused as the 1992 and 1993 Panels were.


At the same time, a decision was made to increase the overall sample size in terms of interviews per year in order to allow the simultaneous interviewing of four panels, each starting with about 6,400 sample beneficiaries. In Round 10 (September-December 1994), we began implementation of the rotating panel process with the 1994 Panel. This group consisted of 6,390 beneficiaries, including a sample of those who became entitled during 1993 or on January 1, 1994. Like the first rotating panel (Round 10), all subsequent panels are designed to be nationally representative samples of the current Medicare population. The following bullets describe panel composition for each of the four panels currently participating in the MCBS.


  • In Round 37 (September-December 2003) the tenth rotating panel was selected, consisting of 6,300 beneficiaries. This panel was the third to be selected from the new (updated) sample of PSUs.

  • In Round 40 (September-December 2004) the eleventh rotating panel was selected, consisting of 6,342 beneficiaries. This panel was the fourth to be selected from the new (updated) sample of PSUs.

  • In Round 43 (September-December 2005) the twelfth rotating panel was selected, consisting of 6,565 beneficiaries. This panel was the fifth to be selected from the new (updated) sample of PSUs.

  • In Round 46 (September-December 2006) the thirteenth rotating panel was selected, consisting of 6,675 beneficiaries. This panel was the sixth to be selected from the new (updated) sample of PSUs. In addition to the 6,675 beneficiaries in the regular panel, a supplemental sample of 1,490 new enrollees was selected for a special assessment of a free physical benefit introduced in 2005. However, unlike the regular panels, the new enrollee supplement will be followed for three years, rather than the four years specified under the MCBS design as described below.


A rotating panel is followed for 12 interviews. There are four panels active at any one time, and each panel has approximately 4,000 active sample persons. A new panel introduced each year in the fall round replaces the oldest panel, which is subsequently retired in the following summer.


Because of the overlap between the new panel and the retiring panel, the number of interviews we conduct in the September-December (fall) round increases from roughly 12,000 to 16,000. Figure C.1, while not drawn to scale, gives a visual display of the overlap that occurs during the simultaneous fielding of four panels in the fall round, and of the special one-round supplement.


The retiring panel (about 4,000 individuals) receives abbreviated questionnaires beginning in the January to April round and, if necessary, in the May to August (summer) round to complete the collection of medical events occurring in the previous calendar year. These sample persons are then rotated out of the study. These individuals participate for a maximum of four years (that is, a baseline interview, three complete years of utilization and expenditure data, and up to two interviews to "close out" events due to late-arriving paperwork).


Each Fall round, under the rotating panel design, a new panel will be introduced and each Summer round a panel will be retired. Thus, for example, the new panel that was introduced in Round 46 will replace approximately 4,000 of the ongoing sample by Round 48. This rotating panel sample design allows for both the eventual termination of participation in the study for individuals and for the completion of about 12,000 interviews for an ongoing study population.


C.2 Description of sample selection and universe


C.2.a Potential respondent universe. The target universe is current Medicare beneficiaries entitled to hospital and/or supplementary medical insurance, living in the 50 states and in Puerto Rico. Both institutionalized and noninstitutionalized beneficiaries are represented. The target universe is computed as of the mid-point of the calendar year. The universe has been divided into the seven sampling strata shown in Table C.1.


Table C.1: Universe Counts by MCBS Sampling Strata
(counts in thousands)

Age           2002      2003      2004      2005      2006      2007
interval                                              (est.)    (est.)
────────────────────────────────────────────────────────────────────
Disabled
  0-44       1,605.0   1,639.8   1,673.8   1,714.8   1,735.3   1,756.1
  45-64      4,077.9   4,321.0   4,570.0   4,885.1   5,042.6   5,205.2
  Total      5,682.9   5,960.8   6,243.7   6,599.9   6,777.9   6,961.3

Aged
  65-69      8,575.1   8,795.2   8,986.4   9,209.1   9,320.4   9,433.1
  70-74      8,454.2   8,352.7   8,358.6   8,382.3   8,394.2   8,433.1
  75-79      7,345.8   7,368.4   7,287.5   7,334.5   7,358.1   7,381.6
  80-84      5,331.6   5,433.7   5,547.4   5,602.8   5,630.6   5,658.4
  85+        4,728.8   4,833.6   4,897.7   5,064.5   5,147.9   5,232.7
  Total     34,435.4  34,783.7  35,077.7  35,593.3  35,851.1  36,111.9

Total       40,118.3  40,744.5  41,321.4  42,193.2  42,629.1  43,073.2
────────────────────────────────────────────────────────────────────
Source: Historical counts (2002-05) are based on CMS administrative records. Projections (2006-07) are based on the average annual rate of change from 2004-05. Distributions by age interval are estimated. Totals do not necessarily equal the sum of rounded components.



C.2.b Sample Sizes.


(i) Target sample sizes. The target sample size of the MCBS is designed to yield 12,000 completed cases a year (approximately 2,000 disabled enrollees and 2,000 enrollees in each of the five age strata for beneficiaries age 65 and over). To achieve the desired number of completed cases, the MCBS selects new sample persons each year to compensate for attrition and retirement of sample persons, and to include the newly eligible population, while continuing to interview the non-retired portion of the continuing sample.


The MCBS adds approximately 6,450 - 6,600 beneficiaries to the sample in the September - December round each year to replace the retiring panel and to offset sample losses due to non-response. We retire approximately 4,000 sample persons in the May - August round each year. As a result, the sample size averages approximately 16,500 interviews per round, which yields approximately 12,000 cases with completed annual utilization and expenditure information. This total is composed of sample persons being retired (about 4,000) and those whose experience will be used to create the calendar year file.


Sample persons who refuse one or more rounds or who cannot be located for one of the scheduled interviews are not counted as completed cases. Proxy interviews are attempted for deceased sample persons; if data are collected through the date of death, such cases are counted as completes. For sample persons who reside in both a community and a facility setting, the round is considered complete if both the community and facility interviews are completed.


Sample persons remain in the survey when they are unavailable for an interview in a given round; that is, they are carried forward into the next round. For these individuals, the reference period for their next core interview covers the period since their last interview, so that there will not be a gap in coverage of utilization and expenditure data. Supplements are administered for the current round only. If a sample person is unavailable two rounds in a row, they are not scheduled for any further follow-up, because extension of the recall period beyond eight months is not feasible.


A broad range of statistics is produced from the MCBS. Robustness and generality have been stressed in the sample design, rather than customization for specific goals. We anticipate that we will continue to over-sample the extreme elderly and the disabled. The methodology for drawing the samples is described in detail in Section C.3.a. Originally, designated sample sizes were larger than targeted completes to compensate for initial non-response and ineligibility; Table C.2 illustrates this compensation. This approach will continue, i.e., sample sizes will be larger than targeted completes in Rounds 49, 52, and 55.

Table C.2: Compensation for Initial Non-Response and Ineligibility

Age on       Desired average number of     Number to be sampled
7/1/2006     cases providing annual data   at Round 46
──────────────────────────────────────────────────────────────
0-44                  333                          615
45-64                 333                          515
65-69                 667                        1,320
70-74                 667                          915
75-79                 667                        1,095
80-84                 667                        1,080
85+                   667                        1,135
──────────────────────────────────────────────────────────────
Total               4,001                        6,675

(ii) Cross-sectional sample sizes for other domains. As a result of the sampling process, there are multiple domains in the MCBS, for example, respondents with end-stage renal disease, persons residing in nursing homes (approximately 1,068 sample persons resided in facilities for some part of Round 43), managed care enrollees, racial and ethnic groups, and Medicaid recipients. The MCBS will continue to maintain a minimum target of 12,000 completed responses annually. This will facilitate maintaining the number of domains and ensuring that analysis can be performed on other domains included in the MCBS.

(iii) Sample sizes for longitudinal analyses. We will depend on the historical response rates maintained by the MCBS (discussed in Section C.2.c) and upon mortality statistics to determine the rotational sample size and configuration. The rotational sample design provides consistency in time across all panels. Respondents will remain in the sample for twelve interviews, as part of the rotating panel design.


C.2.c Response rates. By Round 44, 65 percent of the 2002 panel were still in a formal responding status (that is, the sample person (SP) was either alive and still participating, or had died after Round 34 but left behind a cooperative proxy for the collection of data on the last months of life) or had participated in the survey until death, leaving enough data to estimate the last months of life. For the 2003 and 2004 panels, the corresponding figures were 67 and 71 percent, respectively. The 2005 panel (Round 43) had an initial response rate of 82 percent.


There were 3,277 interviews successfully completed at Round 43 with still-living members of the 2002 panel. For brevity, we refer to these 3,277 interviews as “live completes”. For the 2003 and 2004 panels there were 3,473 and 3,958 live Round 43 completes, respectively.


Conditional and cumulative response rates for Rounds 1-12 of the MCBS 1998 to 2005 panels are shown in Table C.3. Please refer to Figure C.1 (Section C.1.b) for a timeline of the MCBS rotating panel design.


The MCBS has used a variety of techniques to maintain respondents in the survey and reduce attrition. These will be continued and adapted to comply with the time frames for initiating and implementing the continuous sample. These techniques are described in detail in Section C.4.


C.3 Data collection procedures


This section describes the procedures used for the national survey. It includes a general discussion of the statistical methodology for stratification and rotational panel selection, estimation procedures, and the degree of accuracy needed. This is followed by a presentation of how topical supplements will be used to reduce respondent burden. The content of the continuous or core questionnaires is then summarized. Finally, there is a discussion of rules for allowing proxy response.


C.3.a Statistical methodology for stratification and sample selection. This section opens with a description of the sample, which consists of metropolitan areas and clusters of non-metropolitan counties. Within these areas, the sample continues to be concentrated in clusters of ZIP code areas (five-digit). New ZIP fragments will be sampled within the PSUs each fall. This is followed by a general discussion of the selection of the original and supplemental samples. We are using a different five percent of the HISKEW each year, thus reducing the problems associated with duplication of samples across the years. The total sample will be stratified by PSU, ZIP cluster, age domain, sex and race.


Table C.3: Conditional Response Rates for the Medicare Current Beneficiary Survey by Interview Round


              1998    1999    2000    2001    2002    2003    2004    2005
              Panel   Panel   Panel   Panel   Panel   Panel   Panel   Panel
Sample size   6,082   6,085   6,037   5,968   5,967   5,930   5,978   6,565

Round 1       83.3%   84.8%   84.3%   84.8%   84.3%   83.2%   82.2%   84.3%
Round 2       94.5%   94.3%   94.1%   93.1%   92.9%   92.9%   92.8%
Round 3       96.7%   96.5%   96.9%   96.1%   96.6%   95.5%   95.9%
Round 4       97.2%   97.5%   97.3%   96.1%   96.9%   96.1%   96.6%
Round 5       97.8%   97.9%   97.6%   97.1%   97.8%   97.8%
Round 6       98.0%   98.3%   97.8%   97.9%   97.9%   97.7%
Round 7       98.6%   98.3%   98.6%   98.1%   97.1%   98.3%
Round 8       98.7%   98.7%   98.1%   98.1%   98.3%
Round 9       98.4%   98.7%   98.6%   98.6%   98.5%
Round 10      98.8%   98.9%   99.2%   98.0%   98.7%
Round 11      99.3%   99.3%   99.4%   99.0%
Round 12      99.7%   99.8%   99.7%   99.8%


Cumulative Response Rates for the Medicare Current Beneficiary Survey by Interview Round


              1998    1999    2000    2001    2002    2003    2004    2005
              Panel   Panel   Panel   Panel   Panel   Panel   Panel   Panel
Sample size   6,082   6,085   6,037   5,968   5,967   5,930   5,978   6,565

Round 1       83.3%   84.8%   84.3%   84.8%   84.3%   83.2%   82.2%   84.3%
Round 2       78.7%   80.0%   79.4%   79.0%   78.3%   77.3%   76.3%
Round 3       76.1%   77.1%   76.9%   75.9%   75.6%   73.9%   73.2%
Round 4       73.9%   75.2%   74.8%   72.9%   73.3%   71.0%   70.7%
Round 5       72.3%   73.6%   73.1%   70.8%   71.6%   69.4%
Round 6       70.9%   72.4%   71.5%   69.3%   70.1%   67.8%
Round 7       69.9%   71.2%   70.5%   68.0%   68.0%   66.6%
Round 8       69.0%   70.2%   69.1%   66.7%   66.9%
Round 9       67.9%   69.3%   68.1%   65.8%   65.9%
Round 10      67.1%   68.5%   67.6%   64.5%   65.0%
Round 11      66.6%   68.1%   67.2%   63.8%
Round 12      66.4%   67.9%   67.0%   63.7%

Blank cells indicate interview rounds that a panel had not yet completed when this table was prepared.
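
As an illustrative cross-check (an assumption about how the two panels of Table C.3 relate, consistent with the figures shown), the cumulative rate for a panel through a given round is approximately the product of that panel's conditional rates through the same round. A minimal Python sketch for the 1998 Panel column:

# Illustrative check: cumulative response rate through round r is roughly the
# product of the conditional (round-to-round) rates through round r.
conditional_1998 = [0.833, 0.945, 0.967, 0.972, 0.978, 0.980,
                    0.986, 0.987, 0.984, 0.988, 0.993, 0.997]

running = 1.0
for round_number, rate in enumerate(conditional_1998, start=1):
    running *= rate
    print(f"Round {round_number}: {running * 100:.1f}%")
# The printed values track the 1998 Panel column of the cumulative table
# (83.3, 78.7, 76.1, ...); small differences of about 0.1 point reflect
# rounding of the published conditional rates.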

(i) PSU and ZIP code clustering. The MCBS is in the final phase of implementing its PSU redesign. The original MCBS sample was spread across 107 primary sampling units (PSUs), which are metropolitan areas and groups of non-metropolitan counties. Within the PSUs, the initial sample was concentrated in 1,163 second-stage units consisting of clusters of ZIP code areas (five-digit). With the introduction of the 1992 and 1993 supplements, the number of sample ZIP code clusters expanded to 1,366 and 1,412, respectively. Between 1994 and 2001, a total of 111 new ZIP clusters were added to the sample, bringing the total to 1,523 clusters. These PSUs had been randomly selected within fixed strata with probability proportionate to the U.S. population in 1980.

In 1999, CMS and Westat staff began an evaluation of the existing PSU structure. Following the analysis of the aging PSU structure, Westat recommended re-selection of the PSUs. Attachment XII(a) is the evaluation of alternative measures of size for PSU selection. The PSU redesign has been phased in over a four-year period in conjunction with the drawing of each new panel. The reselection of the PSUs does not involve increases in sample size or respondent burden.


The 2001 panel was the first panel in which the MCBS PSU redesign was implemented. As in the original design, 107 PSUs were selected, of which 63 were retained from the original sample. Within the PSUs, the initial sample was concentrated in 1,209 second-stage units consisting of clusters of ZIP code areas. With the rotating panel design, the PSU redesign has been transparent to data users and no special processing has been required. For more details on the PSU redesign, see Attachment XII(b).


The strata used for selection of the PSUs cover the 50 states, the District of Columbia and Puerto Rico. There are some states without any sample PSUs within their boundaries. Within region and metropolitan status, the strata were defined to be internally homogeneous with respect to socio-economic data from the 1990 Census. The sample PSUs are listed in Attachment XI.


All of the ZIP cluster samples were selected from CMS's master file of beneficiaries enrolled in Medicare, using the beneficiary's address recorded in that file as of March of the year the individual was selected to be in the sample.


Within populous PSUs, the next stage of sampling was to select a sample of ZIP clusters. There were several steps in this sampling process. The first was to form ZIP fragments (the intersections of ZIP code areas and counties in sample PSUs). The second was to assign a measure of size to each ZIP fragment. The measure of size was closely related to the total count of Medicare beneficiaries residing in the ZIP fragment, but beneficiaries in domains to be over-sampled (such as persons over age 84) were counted more heavily than persons to be under-sampled (such as persons aged 66 to 69). Some of the ZIP fragments had very small numbers of beneficiaries residing in them (as low as one). These small ZIP fragments were collapsed with each other and with larger ZIP fragments until the aggregate measure of size for each cluster was large enough to provide a reasonable cluster size for the sample. A sample of these ZIP clusters was then selected with probability proportionate to the measure of size, using systematic sampling with a random start. In a few non-populous PSUs, all ZIP clusters were selected with certainty.
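
The selection mechanism described above, probability proportionate to a measure of size with systematic sampling from a random start, can be sketched in Python as follows. This is offered only as a generic illustration; the function, the cluster labels, and the measures of size are hypothetical and do not reproduce the actual MCBS selection program.

import random

def pps_systematic_sample(units, sizes, n, seed=None):
    # Select n units with probability proportionate to size, using systematic
    # sampling with a random start. Units whose size exceeds the sampling
    # interval can be hit more than once; in practice such "certainty" units
    # are usually set aside before sampling (an assumption, not MCBS-specific).
    rng = random.Random(seed)
    interval = sum(sizes) / n              # sampling interval
    start = rng.uniform(0, interval)       # random start
    targets = [start + k * interval for k in range(n)]
    selections, cumulative, t = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cumulative += size
        while t < n and targets[t] < cumulative:
            selections.append(unit)
            t += 1
    return selections

# Hypothetical ZIP clusters and measures of size (weighted beneficiary counts)
clusters = ["A", "B", "C", "D", "E", "F"]
measures = [120, 45, 300, 80, 60, 95]
print(pps_systematic_sample(clusters, measures, n=3, seed=7))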


(ii) Selection of beneficiaries. The initial sample of 15,411 beneficiaries was selected from the 5-percent sample of the Health Insurance Master File (HIM). This was a systematic random sample designed to effect uniform sampling weights within analytic domains at the national level. The universe for this sample was stratified by PSU, ZIP code cluster, analytic domain, sex and projected 1991 reimbursements (based upon 1987 baseline reimbursements).

Initially, the MCBS sample sizes were reviewed in April of each year. New projections were made of the sample that would be left (alive, located, and cooperating) in each of the seven strata by the year's end. Supplementary samples equal in size to the projected shortfall by stratum were selected in Rounds 4 and 7. Each supplement included an appropriately sized sample of the cohort added to the rolls during the prior year. The purpose of these supplementary samples was to constrain the expected sampling error of annual cross-sectional statistics within acceptable limits and to expand coverage to the newly eligible.


Beginning in Round 10, with the transition to a rotating panel design, approximately 6,450 beneficiaries (eligible on January 1 of each year) were selected from the HISKEW. This was a systematic sample stratified by PSU, ZIP Code cluster, age domain (projected to July 1, 1995), gender and race. Nursing home residents will continue to be drawn into the sample in exactly the same manner as other beneficiaries.


Since its inception in 1991 and through 2000, the MCBS beneficiary samples have been selected from the same set of 107 nationally representative geographical areas referred to as primary sampling units (PSUs). Over time, the use of the same PSUs can result in losses in both sampling precision and operational efficiency. As part of ongoing reviews of procedures and methods, a number of analyses were conducted to assess the impact of the continued use of the existing PSU sample in subsequent rounds of the MCBS. The analyses examined the increase in the variation in PSU sample sizes resulting from the repeated use of the original MCBS PSUs, the corresponding expected loss in precision resulting from the variation in PSU sample size, the impact of variable PSU sample sizes on survey operations and costs, and the potential improvement in sampling precision using alternative measures of size for PSU selection. Based on those analyses, it was concluded that it would be desirable and feasible to “update” the sample of PSUs to reflect current measures of size and other changes in PSU structure. Hence, in the Fall of 2000 the MCBS PSU sample was “reselected” for use in the subsequent rounds of the MCBS. The new sample was selected using unbiased procedures that would retain as many of the original PSUs as possible.


C.3.b Estimation procedure. To date, weights have been calculated for Rounds 1, 4, 7, 10, ..., 40, and 43 in the Access to Care series. Weights have also been developed for the Rounds 3 and 6 supplements on income and assets. These weights reflect differential probabilities of selection and were adjusted for under-coverage and non-response. Replicate weights were also calculated so that users can calculate their own standard errors. These replicate weights are available to any analyst upon request. The WesVar software (version 2.12) and user's guide can be downloaded from Westat's home page at www.westat.com. Version 4 of WesVar can also be purchased directly from Westat; the newer version has additional features that are described in detail at www.westat.com/wesvar. Special sets of weights for longitudinal analysis of the access series are also developed for each Access to Care release. Codes exist on each weight file for users who prefer to use an alternative software package, such as SUDAAN, to estimate variances.

Besides standard weighting and replicate weighting, another part of the estimation program is the full imputation of the data sets to compensate for item non-response (Attachment XIV). Imputation of charges for non-covered services and of sources of payment for covered services in the Cost and Use annual file has been developed. The weighting and imputation of data will continue.
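
As background on how replicate weights support user-calculated standard errors, the Python sketch below computes a jackknife-style variance for a weighted total from hypothetical replicate weight sets. It is a generic illustration under stated assumptions, not the WesVar algorithm or the actual MCBS weight structure; the appropriate multiplier depends on the replication method documented with the weight files.

def weighted_total(values, weights):
    return sum(v * w for v, w in zip(values, weights))

def replicate_variance(values, full_weights, replicate_weight_sets, factor):
    # Variance of a weighted total from R sets of replicate weights.
    # `factor` depends on the replication method (e.g., (R-1)/R for a
    # delete-one jackknife, 1/R for BRR) -- an assumption; consult the
    # documentation supplied with the weight file.
    theta_full = weighted_total(values, full_weights)
    return factor * sum(
        (weighted_total(values, rw) - theta_full) ** 2
        for rw in replicate_weight_sets
    )

# Hypothetical data: four respondents, full weights, three replicate weight sets
y = [1200.0, 430.0, 2500.0, 880.0]      # e.g., annual expenditures
w = [105.0, 98.0, 110.0, 101.0]
reps = [
    [0.0, 130.0, 147.0, 135.0],
    [140.0, 0.0, 146.0, 134.0],
    [139.0, 131.0, 0.0, 136.0],
]
variance = replicate_variance(y, w, reps, factor=2.0 / 3.0)  # (R-1)/R with R=3
print(variance ** 0.5)   # standard error of the weighted total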


C.3.c Degree of accuracy needed for the purpose described in the justification. As stated in Section C.1, a broad range of statistics will be produced from the MCBS. There is no single attribute of beneficiaries and their medical expenses that stands out as the primary goal of the survey. Thus, there can be no simple criterion for the degree of reliability that statistics for each analytic domain should satisfy. Even with a minimum of 16,000 sample persons, there will be many small domains of interest for which it will be necessary to use modeling techniques or to wait several years for sufficient data to accumulate. Examples include people with specific medical conditions (e.g., hip fractures), institutionalized persons under age 65, Hispanic persons, and sample persons experiencing spend-down.


The MCBS will maintain a stratified approach to the selection of the sample. The sample will continue to be clustered by PSU and ZIP Code and stratified by age domain. We anticipate maintaining a total of 2,000 annual cases allocated to the disabled. The two age categories were selected because they indirectly reflect the means by which the disabled person becomes eligible for Medicare. Since the number of disabled sample persons per PSU and ZIP code will be small, the effects of clustering on statistical precision should be mild. It is anticipated that post-stratification by characteristics in CMS databases will more than compensate for the effects of clustering. Thus, with an effective sample size of 1,000 or more for each age stratum, accuracy for each of the two age strata should not be much different from that commonly attained in public opinion surveys. Since many of the statistics may be heavily right-skewed, the accuracy may be lower in relative terms but still acceptable.


Each of the age strata for the aged Medicare sample will be allocated 2,000 cases. A major reason for oversampling the very old is to obtain an adequate sample of nursing home stays while minimizing design effects. Variations in sampling weights across the age strata and clustering should result in an effective sample size of approximately 1,000 cases annually per stratum.
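
For readers who want to reproduce this kind of calculation, the Python sketch below applies Kish's approximation for the effective sample size under unequal weighting. This is offered only as an illustration under assumed weights; it is not the exact computation used for the MCBS, and clustering would reduce the effective sample size further.

def effective_sample_size(weights):
    # Kish's approximation: n_eff = (sum of weights)^2 / (sum of squared weights).
    # Equals the nominal n when all weights are equal.
    s1 = sum(weights)
    s2 = sum(w * w for w in weights)
    return s1 * s1 / s2

# Hypothetical stratum of 2,000 cases with a 2:1 spread in sampling weights
weights = [1.0] * 1000 + [2.0] * 1000
print(round(effective_sample_size(weights)))   # 1800 under these assumed weights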


C.3.d Interview content for periodic data collection cycles to reduce burden.


(i) Content and timing of the continuous or core interview. The primary variables of interest for the MCBS are the use and cost of medical care services and associated sources and amounts of payment. While Medicare claims files supply information on billed amounts and Medicare payments for covered services, the survey provides information on use of services not covered by Medicare and on payment sources and amounts for costs not reimbursed by Medicare. For both the household and facility core components, the primary focus of the data collection is on use of services (dental, hospital, physician, medical providers, prescription medication and other medical services), sources and amounts of payment, and health insurance coverage. The “core” MCBS interview collects continuous information on these items through thrice-yearly interviews. The community component also contains summary components, which update the household enumeration and health insurance status and follow-up on cost and sources of payment information for “open items” from the previous interview.


Continuous data on utilization and expenditures are required for a number of reasons. First, several of the distinct expenditure categories involve relatively rare medical events (inpatient hospital stays, use of home health care, purchase of durable medical equipment), so limiting the reference period would mean insufficient observations for national estimates. Second, episodes of medical care often consist of a series of services over weeks or months; continuous data will allow examination of the grouping of services around particular episodes of care. This is particularly important when a number of medical services are included in a global fee. Third, payment for medical services often occurs considerably later than the utilization, so complete information about a particular event can often be obtained only some time after the event occurs. In addition, this emphasis on utilization and expenditures will form an excellent baseline for monitoring both Medicare reform and the effectiveness of CMS's program management.

The administration of the instruments will continue to follow the established pattern of data collection, i.e., baseline information will be collected in the initial interview. This will be followed in all subsequent interviews with the core component. The core community and facility components are administered in the second interview (January through April) to maintain a consistent reporting period for utilization and expenditure data. Since the initial interview always occurs in the last four months of a calendar year, collection of utilization and expenditure data in the second interview means the reference period will always begin prior to January 1st. This creates use and expenditure estimates on a calendar year basis.


The access, enumeration and demographic series (i.e., baseline information) will be asked and reference dates established in Rounds 49, 52 and 55 for those individuals new to the MCBS. The core components are administered in every round thereafter. For those continuing sample persons, we administer the core questionnaire in addition to the baseline instrument in Rounds 49, 52 and 55.


The literature (initially reported by Neter and Waksberg in 1964, and confirmed in subsequent research by other analysts) indicates that collection of behavioral information in an unbounded recall period can result in large recall errors. A part of the initial interview (Rounds 49, 52 and 55) prepares the respondent for the collection of utilization and expenditure information in subsequent rounds, thus “bounding” the recall period for the next interview. In addition, at the conclusion of the initial interview, the sample person (new rotational sample only) is provided with a calendar. This calendar marks the recall period for the respondent, serves as the means to record utilization, and as a prompt to retain statements and bills.


(ii) Content of the core/continuous questionnaire, Rounds 48-56. We are proposing no change in content in the core questionnaire for Rounds 48-56.


Community Questionnaire.


Introduction and enumeration section. In the initial interview, the MCBS collects information on the household composition, including descriptive data on the household members such as age, gender and relationship to the sample person. We also verify the address and telephone number of the sample person. This information is updated in each subsequent round.


Health insurance. In the initial interview, we collect information on all sources of secondary health insurance, both public and private, that cover the sample person. Included are questions about premiums, coverage, the primary insured, the source of the policy (i.e., private purchase, employer sponsored, etc.) and managed care status. This information is updated in each subsequent round.


Utilization series. This section collects information on the sample person's use of medical services. We specifically probe for use of: dental services, emergency room services, inpatient hospital services, outpatient hospital services, institutional services (skilled nursing home services, intermediate care facility services, etc.), home health services, medical provider services (medical doctors, chiropractors, physical therapists, etc.), prescribed medicines and other medical services. For each type of service reported, we collect information on the source of care, type of provider, date that the service was provided, and whether medications were prescribed as part of the event. This episodic information is collected for all services since the date of the last interview.


Charge questions: statement and no-statement series. These sections collect information on costs, charges, reimbursements and sources of payment for the health care services reported in the utilization series. If a respondent has an insurance statement (Medicare Summary Notice or private health insurance statement) for a reported medical service, the statement series is administered. For reported medical utilization, if a respondent indicates that a statement has not been received but is expected, we defer asking about this service until the statement is received. If the respondent does not have and does not expect to receive a statement, the no-statement series is asked. Questions are asked about the cost of the services, charges, expected reimbursement, and potential or actual sources of payment (including other family members).


Summary Information. Updates and corrections are collected through the summaries. For the enumeration, insurance and utilization sections, the respondent is handed a hard copy of the information reported or updated in the previous round. The respondent is asked to review this and make any corrections or modifications. For medical events, the respondent is handed a hard copy of the calendar. This replicates the reporting by month from the previous round and reinforces the use of a calendar for reporting events. These summary sheets are prepared by month so that the respondent can rapidly scan the reported events and modify, add or delete episodes of health care. In addition, updates to prescribed medication use can be made at this time.


In addition, information for events that remained open in the previous round (i.e., the respondent expected to receive a statement but had not received it at the time of the last interview) is collected in the charge and payment summary. Information is collected through this summary in a manner that is consistent with the statement or no-statement series (see Attachment V for copies of the summaries).


Facility Questionnaire. The facility component collects information that is similar in content to the household interview. Sections of the institutional instrument parallel the household instrument; for example, the residence history parallels the household enumeration section, the provider probes capture information that is similar to the community utilization section, and the institutional charge series parallels the household charge series (statement and no-statement series). Differences between the facility and community components result from differences in the setting of the interview and the types of respondents. The facility questionnaire is administered by the interviewer to one or more proxy respondents designated by the facility director, while the household instrument is administered to the sample person or their designated proxy. Both the household and facility interviews are record driven; however, facility respondents refer to formal medical care records, while in the household the respondent depends on his or her own record keeping. The core facility instrument contains the following sections:


Residence History. This section collects continuous information on the residence status of the sample person, including current residence status, discharge and readmission.


Health Services. This section collects information on medical use by type of service. The type of provider and the setting used are identified for reported medical events. In addition, information is collected on the number of times or volume of care received.


Prescribed Medicines. All medications administered in a facility are prescribed. Information is collected on the name, form, strength, and dispensing frequency of the medication.


Inpatient Hospital Stays. Information is collected on any inpatient hospital stays reported in the Residence History.


Institutional charges. This section collects information from the institutions on the charges, reimbursement levels and sources of payment for the sample person. Information is also collected on bad debt and other sources of differences between bills and payments.


(iii) Content of topical supplements. The MCBS interview consists of core items and one or more topical supplements. The content of the supplements is determined by the research needs of CMS, the Department, and other interested agencies, including the Medicare Payment Advisory Commission. Topics for the community component include: income, assets, program knowledge and participation, demographic information, health and functional status, satisfaction with care, and usual source of care. For the facility instrument topical supplements include the eligibility screener and the baseline instrument (contains questions on demographics and income, residential history, health status and functioning, type of housing and health insurance).



Table C.4: Supplements for Clearance

2007
Round 47                          Round 48                        Round 49
Core Interview                    Core Interview                  Core Interview
Knowledge and Information Needs   Income and Assets               Overlap Series
Prescription Drug Choice          Prescription Drug Awareness     Facility: Baseline and Screener

2008
Round 50                          Round 51                        Round 52
Core Interview                    Core Interview                  Core Interview
Knowledge and Information Needs   Income and Assets               Overlap Series
Prescription Drug Choice          Prescription Drug Awareness     Facility: Baseline and Screener
Patient Activation

2009
Round 53                          Round 54                        Round 55
Core Interview                    Core Interview                  Core Interview
Knowledge and Information Needs   Income and Assets               Overlap Series
Prescription Drug Choice          Prescription Drug Awareness     Facility: Baseline and Screener

──────────────────────────────────────────────────────────────
- Household Core Interview = Household Composition, Health Insurance, and Utilization and Charge Series (statement/no-statement series)
- Facility Core Interview = Residence History, Provider Probes, Prescription Medications, Hospital Stay and Institutional Charges.
- Overlap Series = Access to Care, Satisfaction with Care, Usual Source of Care, Health Status and Functioning, Housing Characteristics, Demographics and Income.
- Facility Baseline = Demographics and Income, Residence History, Health Status and Functioning, and Health Insurance.

For the community interview, we are requesting clearance to continue to field the Overlap Series (i.e., Usual Source of Care, Access to Care, Satisfaction with Care, Health Status and Functioning, Health Insurance, Household Enumeration, Housing Characteristics, Demographics and Income, and Provider Probes), Income and Assets, Knowledge and Information Needs, Prescription Drug (to complement the change in the enrollment period, content will be split between the January-April and May-August rounds), and Patient Activation supplements. For the facility interview, we are requesting clearance for the eligibility screener and the baseline instrument.


Table C.4 presents the supplements that we are seeking clearance for at this time. If additional supplements are planned, separate clearance packages will be developed.

C.3.e Rounds 48 through 56 data collection procedures.


(i) Interviews with sample persons in the community. In Rounds 49, 52, and 55, all newly selected sample persons will be sent an advance letter (Attachment IV) from the Centers for Medicare and Medicaid Services. Interviewers will carry copies of the advance letter for sample persons who do not recall receiving one in the mail, as well as a copy of the MCBS brochure and question-and-answer sheet (Attachment IV). This process has been effective and will remain in place.


The household component interview (Rounds 48-56), described in Section C.3.d above, will be administered to the sample person or a proxy using a computer-assisted personal interviewing (CAPI) program on a laptop computer. A hard-copy representation of the Rounds 48-56 continuous core CAPI interview for persons living in the community is shown in Attachment V. Attachment V includes a copy of the instrument that is administered in the initial interview, the ongoing interview, and the show cards used by the interviewer to assist in the interviewing process.


At the completion of the initial interview (i.e., the Round 49, 52, or 55 interview), each new sample person is given an MCBS calendar (Attachment VI), on which he or she is encouraged to record health care events. The same calendar is provided to all continuing community respondents on a calendar-year basis.


(ii) Interviews with sample persons in institutions. All new facility admissions during Rounds 48-56 will be traced to the institution where they reside. For the initial facility interview, the Eligibility Screener, Baseline and Core Questionnaires are administered. All facility interviews are administered to facility staff using a CAPI program on a laptop computer. For all facility residents, the facility screener is administered during the fall of each year (Attachment VII). The facility core institutional questionnaire to be used in Rounds 48-56 is shown in Attachment VIII.


Some administrators will require consent of the sample person or a next of kin before releasing any information. The data collection contractor will offer to obtain such written consent, using the consent form and letter included as Attachment IX.


(iii) Verification Interviews. A brief verification re-interview (Attachment X) will be conducted for 10 percent of the interviews.


C.3.f Proxy rules. For community sample persons, the preferred mode is self-response. During the initial interview (with subsequent updates), sample persons are asked to designate proxy respondents. These are individuals who are knowledgeable about the sample person's health care and the costs and expenditures for that care. In the MCBS, only those individuals who are designated by the sample person can serve as proxy respondents.

The facility setting presents a different and changing set of circumstances for the MCBS. In the past, the MCBS made no attempt to interview residents of facilities directly. However, changes in elderly care mean that interviewers now encounter facilities that provide a wider range of services falling outside the scope of traditional Medicare-certified facilities. In some cases, such as custodial care and assisted living communities, the best person to answer our questions is the beneficiary, rather than facility staff. MCBS interviewers are now trained to determine and seek out the appropriate source for interviewing. While we expect that the majority of facility interviews will continue to be conducted with facility staff, without contact with the beneficiary, there will be cases of self-response in the facility setting. For persons who move in and out of long-term care facilities, standard procedures will be used to determine the best respondent to provide data about the period spent outside of such facilities. Self-response will be used in prisons if permitted. Other institutions will be treated on a case-by-case basis.


C.4 Methods to Maximize Response Rates


The MCBS samples a heterogeneous population that presents a unique challenge for maximizing response rates. The household survey approaches two groups, aged and disabled Medicare beneficiaries, who have characteristics that often lead to refusals on surveys. Increasing age, poor health, or the poor health of a family member are prevalent reasons for refusal. On the other hand, older persons are the least mobile segment of the population and thus are less likely to be lost due to failure to locate. The disabled population tends to have a slightly higher response rate than the aged population: while the percentage of non-response due to death among the disabled is comparable to that of the 70-74, 75-79 and 80-84 age brackets, their refusal rates are the lowest of all age categories.


Because this is a longitudinal survey, it is essential that we maximize response rates. To do so, survey staff undertake an extensive outreach effort annually. This includes notification of government entities (CMS regional offices and hotline, carriers and fiscal intermediaries, and Social Security offices), national organizations including the American Association of Retired Persons and the Association for Retarded Citizens, and various community groups (e.g., mayors' offices, police, social service and health departments, home health agencies, state advocates for the elderly, and area agencies on aging). These efforts are undertaken to increase the likelihood that respondents will answer the MCBS questions and remain in the survey panel by: 1) informing authoritative sources to whom SPs are likely to turn if they question the legitimacy of the MCBS; 2) giving interviewers resources to which they can refer to reassure respondents of the legitimacy and importance of the survey; and 3) generally making information about the MCBS available through senior centers and other networks to which SPs are likely to belong, and through the CMS website.


In addition to the outreach efforts, the following measures remain in place to maintain a sense of the survey's validity and relevance among participants.


  • An advance letter from CMS, bearing the Administrator's signature, is sent to both potential sample persons and facility administrators. This includes an informational brochure answering anticipated questions.

  • A handout with Privacy Act information and an appeal to participate is given to the SP at the door by the interviewer.

  • Interviewer training emphasizes the difficulties in communicating with the older population and ways to overcome these difficulties.

  • Individualized non-response letters are sent to SPs who refuse to participate. These letters are used when deemed appropriate by the field management staff. CMS staff follows up with respondents who refused because of concerns about privacy and federal sponsorship of the survey.

  • Proxy respondents are sought for SPs unable to participate for themselves.

  • Non-respondents are re-contacted by a refusal conversion specialist.

  • A toll-free number is available at Westat to answer respondents' questions.

  • An e-mail address and website are available at CMS to answer respondents' questions.

  • The sample person is paired with the same interviewer throughout the survey. This maintains rapport and establishes continuity of process in the interview.

  • Periodic feedback mechanisms have been established. These include describing the availability of data, types of publications presenting MCBS data and preliminary findings presented in the form of data summaries.

  • We encourage personal touches, including interviewer notes and birthday cards.

  • Personal letters of appreciation have been sent from the Federal Project Officer. These letters include information on recent publications from the MCBS and status of the project. In addition, information on selected supplements (e.g., Income and Assets) has been mailed to sample persons prior to the interview.


In contrast to most surveys, the MCBS has a large amount of information with which to characterize non-respondents. This information, including Medicare claims data, can be used for imputation if necessary. To minimize the risk of bias from non-response, the most up-to-date non-response adjustment techniques are used. Models predicting the propensity not to respond are built based upon the extensive administrative databases available and upon data from earlier rounds. We then use the estimated propensity to respond to form cells within which respondent weights are adjusted. Simultaneously, the substantive characteristics of non-respondents will continue to be tracked in the administrative databases to monitor the risk of bias.
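
The general shape of such a propensity-based weighting-class adjustment is sketched below in Python. This is a simplified illustration under assumed data, not the actual MCBS adjustment specification; the covariates, the logistic model, and the use of propensity quintiles as cells are assumptions made for the example.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical frame: two administrative covariates, a response flag, base weights
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([rng.integers(0, 2, n),     # e.g., a Medicaid buy-in flag (assumed)
                     rng.normal(75, 8, n)])     # e.g., age (assumed)
responded = rng.random(n) < 0.8
base_weight = np.full(n, 100.0)

# 1. Model the propensity to respond from the administrative covariates.
propensity = LogisticRegression().fit(X, responded).predict_proba(X)[:, 1]

# 2. Form adjustment cells from propensity quintiles.
cells = np.digitize(propensity, np.quantile(propensity, [0.2, 0.4, 0.6, 0.8]))

# 3. Within each cell, inflate respondent weights to account for nonrespondents.
adjusted = base_weight.copy()
for c in np.unique(cells):
    in_cell = cells == c
    factor = base_weight[in_cell].sum() / base_weight[in_cell & responded].sum()
    adjusted[in_cell & responded] *= factor
adjusted[~responded] = 0.0
print(adjusted[responded][:5])   # adjusted weights for the first few respondents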


C.5 Person responsible for statistical aspects of design


Adam Chu

Senior Statistician

Westat, Inc.

(301) 251-4326


Westat, Inc., of Rockville, Maryland conducts the MCBS.

REFERENCES


Cohen, S.B., and V.L. Burt: “Data Collection Frequency Effect in the National Medical Care Expenditure Survey.” Journal of Economic and Social Measurement, 13: 125-151 (1985).


Cohen, S.B., and B.B. Cohen: “Data Collection Frequency Effect in the National Medical Care Utilization and Expenditure Study.” American Statistical Association, Proceedings of the Section on Survey Research Methods (in press, 1985).


Corder, L.S., and K.G. Manton: “National Surveys and the Health and Functioning of the Elderly: The Effects of Design and Content.” Journal of the American Statistical Association, 86, No. 414: 513-525 (1991).


Densen, P.M.: “Tracing the Elderly Through the Health Care System: An Update.” AHCPR Monograph: 282-87-0049, pp.1-36 (1987).


Dulaney, R., Vincent, C., and Rhoads, M.: “The CAPI Survey Management System for the Medicare Current Beneficiary Survey.” Paper presented at the Census Bureau Annual Research Conference. Arlington, VA. March 24, 1992.


Edwards, B., Edwards, S., Gray, N., and Sperry, S.: “CAPI and the Medicare Current Beneficiary Survey: A Report on Round 1.” Paper presented at 1992 conference of the American Association for Public Opinion Research. St. Petersburg Beach, FL. May 18, 1992.


Edwards, S., Sperry, S., and Edwards, B.: “Using CAPI in a Longitudinal Survey: A Report From the Medicare Current Beneficiary Survey, Proceedings of Statistics Canada Symposium 92: Design and Analysis of Longitudinal Surveys.” Statistics Canada. Ottawa, Ontario. November 1992.


Eppig, F. and Chulis, G.: “Matching MCBS and Medicare Data: The Best of Both Worlds.” Health Care Financing Review. Vol. 18, No. 3: pp.211-229 (1997).


O’Sullivan, J., Lee, J., and Yang, B.: “Medicare: The Role of Supplemental Health Insurance.” CRS Report to Congress 96-826EPW, Washington, DC: Congressional Research Service, The Library of Congress. October 10, 1996.


Sperry, S.: “CAPI and the Medicare Current Beneficiary Survey.” Paper presented at the Census Bureau Committee on Computer Assisted Survey Information Collection. Suitland, MD. November 21, 1991.


Waldo, D.R., Sonnefeld, S.T., McKusick, D.R., Arnett, R.H.: “Health Expenditures by Age Group, 1977 and 1987.” Health Care Financing Review Vol. 10, No. 4: 111-120 (1987).

LIST OF ATTACHMENTS


Attachment I: CMS Strategic Action Plan 2006-2009


Attachment II: MCBS Analysis Plan


Attachment III: Federal Register notice

Attachment IV: Advance Letter

MCBS Brochure and Introduction to MCBS Sheet

Confidentiality Agreement


Attachment V: MCBS Household Round 46 Instruments to Include:

Baseline and Core in both English and Spanish

Supplements

Show Cards


Attachment VI: MCBS Calendars for Supplemental and Continuing Samples


Attachment VII: Facility Screening Interview Script for Meeting with Facility Administrators


Attachment VIII: MCBS Facility Round 46 Instrument to Include:

Baseline / Core


Attachment IX: Consent forms: Resident and Next of Kin Authorization to Obtain Information from Medical Records


Attachment X: Verification Re-interview


Attachment XI: MCBS: Figure 1: State, County and City


Attachment XII(a): “Overview of the Medicare Current Beneficiary Study Redesign Evaluation Task”

XII(b): “Redesign of the Medicare Current Beneficiary Survey Sample”


Attachment XIII: Report: “Mis-Reporting of Prescription Drug Utilization and Expenditures in the MCBS”


Attachment XIV: Report: “Impact of Nonresponse on Medicare Current Beneficiary Survey Estimates”



