MEDICARE CURRENT BENEFICIARY SURVEY (MCBS)
Fall 2019 Advance Letter Experiment Report
HHSM-500-2014-00035I, HHSM-500-T0002
February 7, 2020
Presented to:
William Long, Contracting Officer’s Representative
Centers for Medicare and Medicaid Services
7111 Security Boulevard
Baltimore, MD 21244
Presented by:
Susan Schechter, Project Director
NORC at the University of Chicago
55 East Monroe Street, 30th Floor
Chicago, IL 60603
Table of Contents

Fall 2019 Advance Letter Experiment Report
Material Development and Approval
Training Field Staff to Conduct Experiment
Appendix A: Experimental Letter
Appendix C: Advance Letter Questions in the Interviewer Remarks Questionnaire (IRQ)
The Medicare Current Beneficiary Survey (MCBS) is a longitudinal panel, multi-purpose survey of a nationally representative sample of the Medicare population, conducted by the Centers for Medicare & Medicaid Services (CMS) through a contract with NORC at the University of Chicago. CMS received approval under Generic Clearance 0938-1275 on December 3, 2018 to conduct a split-ballot experiment in Fall 2019 to test the revision of the advance letter, referred to as the experimental letter, against the original advance letter, referred to as the control letter. Performance of each experimental group was analyzed to determine if changing the letter had any impact, and if so, which letter should be used moving forward. The experimental letter group had a significantly higher completion rate and the revised letter was preferred by field interviewers. All other metrics showed no significant difference between groups at the end of data collection. Based on these findings, CMS recommends using the revised experimental letter in future data collection rounds.
The Medicare Current Beneficiary Survey (MCBS) was launched in 1991 and is a continuously fielded, face-to-face survey of a nationally representative sample of the Medicare population conducted by CMS through a contract with NORC at the University of Chicago. The Medicare population includes all Medicare eligible persons aged 65 and over, and persons under age 65 with certain disabilities or with end-stage renal disease (ESRD). The MCBS uses a rotating panel design and collects data from Medicare beneficiaries up to eleven times over a span of four years. Incoming panels are sampled and recruited in the fall of each year to replace the panel that rotates out in the winter.
Advance letters are used to legitimize and increase respondent awareness of a forthcoming survey. The MCBS advance letter is mailed to incoming panel sample members prior to the start of data collection each fall round. This mailing also includes a brochure to address frequently asked questions regarding participation and provide results from previous rounds of the survey. NORC and CMS staff explored various ways to update these mail materials and improve response rates. The resulting recommendations for the advance letter included simplifying letter text, highlighting the request for respondent participation, reducing extraneous information and agency references unfamiliar to the target audience, and updating the formatting to modernize the letter.
CMS collaborated with NORC to implement these changes ahead of Fall 2019 with the goal of bringing the letter, which was minimally revised in 2014, in line with best practices in the field of survey research and respondent communication. An experiment was implemented to test the performance of sample receiving the new, experimental letter against sample receiving the original, control letter. This report summarizes the results of this advance letter experiment and offers recommendations on how to proceed with future implementation.
The aim of this experiment was to assess whether the new advance letter design is more effective in promoting survey participation. There are many factors impacting participation and cooperation, so analysis focused on several key metrics, including survey completion rates and a measure of respondent recall of the letter. Survey paradata were analyzed to determine if there were any significant differences in these metrics between the incoming panel respondents who received the experimental advance letter and those who received the control letter. To expand upon these findings, qualitative feedback from field interviewers was also collected. These data were used to assess other aspects of response to the letter that may not be captured in quantitative survey data or paradata.
This letter experiment was conducted as a split-ballot design for the incoming panel sample released in July 2019.[1] Cases included in the experiment were flagged as part of the routine sampling process used to select the 2019 Incoming Panel. Half the sample was designated the experimental group, set to receive the newly designed letter, and the other half was designated the control group, set to receive the original version of the letter. These groups were assigned randomly. Letters were mailed to all sampled beneficiaries approximately two business days before the field period began on July 22, 2019. No other changes to the questionnaire or interviewing protocol were associated with this experiment.
The sampling frame for each new incoming panel begins with Medicare administrative enrollment data. To avoid duplication across the various panels of MCBS beneficiaries, a unique and disjoint 5-percent sample of the enrollment data is specified annually by CMS for the MCBS. Sampling of the incoming panel for Fall 2019 was conducted per usual practices, as it was unaffected by this experiment. Once the sample for the first incoming panel case release was selected, each case was randomly assigned to the control or experimental group. A flag variable called “Experiment Flag” was added to the case data, with a value of 0 representing assignment to the control group and a value of 1 representing assignment to the experimental group.
The MCBS sample was stratified by the fourteen-level stratum variable, which represents the cross of the seven-level age group and Hispanic status. The sample was also sorted by primary sampling unit (PSU) and secondary sampling unit (SSU) before random selection of experimental flag cases, which gave additional control over the distribution of the sample by PSU and SSU. Beneficiaries were then selected by systematic random sampling within strata at a rate of 50 percent; selected cases were assigned the experimental flag. Balance of the experimental flag across the dimensions of stratum, sex, PSU, and region was verified.
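For illustration only, the sketch below shows one way a 50 percent within-stratum assignment of this kind could be carried out; the column names (stratum, psu, ssu), the pandas data layout, and the every-other-case systematic selection are assumptions made for the example, not the MCBS production sampling code.

```python
import numpy as np
import pandas as pd

def assign_experiment_flag(frame: pd.DataFrame, seed: int = 2019) -> pd.DataFrame:
    """Assign a 0/1 'experiment_flag' to half of each stratum (illustrative sketch).

    Sorts by PSU and SSU within each stratum, then takes a systematic 50 percent
    selection (every other case from a random start) to mirror the described design.
    """
    rng = np.random.default_rng(seed)
    frame = frame.sort_values(["stratum", "psu", "ssu"]).copy()
    frame["experiment_flag"] = 0
    for _, idx in frame.groupby("stratum").groups.items():
        start = rng.integers(2)          # random start within the stratum: 0 or 1
        selected = idx[start::2]         # every other case -> 50 percent selection
        frame.loc[selected, "experiment_flag"] = 1
    return frame

# Hypothetical usage with a toy frame of sampled beneficiaries
toy = pd.DataFrame({
    "stratum": [1, 1, 1, 1, 2, 2, 2, 2],
    "psu":     [10, 10, 11, 11, 20, 20, 21, 21],
    "ssu":     [1, 2, 1, 2, 1, 2, 1, 2],
})
print(assign_experiment_flag(toy))
```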
OMB approval to conduct the experiment was granted in December 2018, in advance of fielding of the experimental letter in July 2019.
Changes made in the experimental letter include streamlined and re-ordered text, larger font size, increased emphasis on the purpose of the letter in the first paragraphs, and increased whitespace. The experimental letter more clearly states that an interviewer will visit the home and ask to conduct an interview. Unfamiliar acronyms and jargon were removed to make the letter easier for a general audience to understand. The Department of Health and Human Services logo was chosen to replace the CMS logo in the header, as this more closely matches the mailing envelope, as well as other correspondence beneficiaries receive related to Medicare. The final versions of the experimental and control letters can be seen in Appendixes A and B, respectively.
Advance letters are mailed to sampled beneficiaries ahead of the release of these cases to field interviewers, but generic versions of the advance letters are provided to field staff to jog respondent memory and gain cooperation in person. As such, field interviewers were notified of the experiment and received training regarding both letter types during regular remote training ahead of the Fall 2019 data collection period. To facilitate the experiment in the field, a flag in the case management system indicated whether each respondent received the experimental or control letter. Interviewers were provided with copies of each version of the letter to show to respondents as part of their introduction to the survey. At the conclusion of each completed interview, two questions in the Interviewer Remarks Questionnaire (completed by the interviewer, not the respondent) asked the interviewer to assess whether the respondent recalled the advance letter, and whether the interviewer presented the letter (see Appendix C). No other changes were made to interviewer protocol.
NORC provided a print vendor with both versions of the letter and instructions indicating which cases should receive which letter, using the experimental flag created by the sampling team. A quality assurance process was implemented to ensure the correct letter version was mailed to each respondent as prescribed by the flag. Two business days before the start of data collection, 11,158 letters were mailed, as presented in Exhibit 1. A small subset of cases was sent bilingual letters containing both English and Spanish versions, based on CMS-delivered indicators of language preference. Of the letters mailed, 199 control letters and 198 experimental letters were bilingual, comprising about 4 percent of each experimental group. Data collection began on Monday, July 22, 2019 and ended on December 31, 2019.
Exhibit 1. Type of letters mailed, by language

Language     Control Letter    Experimental Letter
English      5,398             5,363
Bilingual    199               198
Total        5,597             5,561
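As a loose illustration of the quality assurance check described above, the sketch below cross-checks a hypothetical mail file against the experiment flag; the file layout, column names, and version labels are assumptions made for this example and do not reflect NORC’s actual systems.

```python
import pandas as pd

# Hypothetical inputs: the sampling file carrying the experiment flag, and the
# print vendor's mail file recording which letter version was actually mailed.
sample = pd.DataFrame({"case_id": [1, 2, 3, 4],
                       "experiment_flag": [0, 1, 0, 1]})        # 0 = control, 1 = experimental
mail_file = pd.DataFrame({"case_id": [1, 2, 3, 4],
                          "letter_version": ["control", "experimental",
                                             "control", "experimental"]})

# Translate the flag into the expected letter version and compare to what was mailed.
expected = sample["experiment_flag"].map({0: "control", 1: "experimental"})
merged = sample.assign(expected_version=expected).merge(mail_file, on="case_id")

mismatches = merged[merged["expected_version"] != merged["letter_version"]]
assert mismatches.empty, f"{len(mismatches)} cases mailed the wrong letter version"
print("All mailed letter versions match the experiment flag.")
```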
As noted above, there are many factors impacting respondent cooperation and participation in a field study like MCBS. This analysis focuses on those factors most likely to be influenced by the advance letter received. Results presented here utilize final case status and other paradata metrics collected as of the end of the data collection period.
Definitions of each metric are detailed below. A completed case was defined as a case with Community interview data through the final section of the interview and a final disposition of Complete for the Fall 2019 data collection round. A refusal was defined as any case with at least one refusal contact disposition recorded within the record of calls during the fall round, regardless of its final case status; these include both “hard” and “soft” refusals.[2] Success at the first in-person contact with the respondent is defined as an outcome of an appointment or a completed interview component upon the first in-person contact (outcomes like “Not Home” or “Call Back” are considered unsuccessful). The count of contacts per case includes all in-person and phone contact attempts, whether successful or unsuccessful. Any informational or operational records of call that do not indicate interviewer effort to contact are excluded from contact counts.[3] Finally, data from the Interviewer Remarks Questionnaire (IRQ) assess respondent recall of the advance letter, although these data are only available for respondents who completed the full interview, limiting their utility to completed cases.
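As a rough sketch of how these definitions could be operationalized against record-of-calls paradata, the snippet below derives contact counts, refusal status, and first in-person contact success for a hypothetical table; the column names and layout are assumptions, and the disposition strings simply echo the examples given in the text and footnotes.

```python
import pandas as pd

# Hypothetical record-of-calls table: one row per contact attempt, in
# chronological order within each case.
calls = pd.DataFrame({
    "case_id":     [101, 101, 101, 102, 102, 103],
    "mode":        ["in-person", "phone", "in-person", "in-person", "in-person", "phone"],
    "disposition": ["Not Home", "Appointment", "Complete",
                    "Refusal", "Complete", "Supervisor Review"],
})

REFUSALS = {"Refusal", "Hostile Refusal"}                        # soft and hard refusals
NON_CONTACTS = {"Locating Results-Batch", "Supervisor Review", "Comment"}
SUCCESS = {"Appointment", "Complete"}                            # successful first-contact outcomes

# Keep only records that reflect interviewer effort to contact the beneficiary.
effort = calls[~calls["disposition"].isin(NON_CONTACTS)]

contacts_per_case = effort.groupby("case_id").size()             # all in-person and phone attempts
ever_refused = (effort.assign(refused=effort["disposition"].isin(REFUSALS))
                .groupby("case_id")["refused"].any())            # any refusal, regardless of outcome
first_in_person = (effort[effort["mode"] == "in-person"]
                   .groupby("case_id")["disposition"].first())
first_contact_success = first_in_person.isin(SUCCESS)            # appointment or complete on first visit

print(contacts_per_case, ever_refused, first_contact_success, sep="\n\n")
```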
Bivariate analyses, comparing the experimental group to the control group, were conducted for each of the metrics outlined above. Chi-squared tests were used to determine statistical significance, with p-values below 0.05 considered significant. All metrics were tracked weekly during data collection and summarized in total once data collection was completed.
Overall, a higher proportion of cases receiving the experimental letter completed the survey: 54 percent of cases in the experimental group compared to 52 percent in the control group, a statistically significant difference. This suggests that there may be some qualities of the experimental letter that encourage respondent participation.
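For reference, the snippet below shows the form of chi-squared test of independence implied by this comparison; the cell counts are illustrative placeholders only (roughly consistent with the reported 52 and 54 percent completion rates and the mailing totals in Exhibit 1), not the actual MCBS case counts.

```python
from scipy.stats import chi2_contingency

# Illustrative 2x2 table: rows = letter group, columns = completed vs. not completed.
# Counts are placeholders, not the actual MCBS case counts.
table = [
    [2910, 2687],   # control group: completed, not completed (~52% of 5,597)
    [3003, 2558],   # experimental group: completed, not completed (~54% of 5,561)
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
print("significant at 0.05" if p_value < 0.05 else "not significant at 0.05")
```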
Refusal rates, success of first in-person contact, and contact attempts per case were all similar across experiment groups. This suggests the revised advance letter did not have any negative impacts on respondent cooperation or participation.
A final measurement is advance letter recall. After the Community interview is completed, field interviewers also complete the Interviewer Remarks Questionnaire (IRQ), which includes a question to assess advance letter recall. Among respondents with completed interviews, 62 percent of those in the experimental group recalled the letter, compared to 49 percent of those in the control group, a statistically significant difference. Again, it is important to note the limitations of this measure: it is recorded only for completed interviews, and it is the field interviewer’s assessment, rather than a direct question to respondents.
To get a more complete understanding of letter performance and response in the field, interviewers were also asked for their views of the experiment and which letter they preferred.
Overall, 55 percent of interviewers expressed a personal preference for the experimental letter, compared to 26 percent who preferred the control letter. They also expressed greater preference for the new letter for the purposes of gaining cooperation, with 53 percent preferring the experimental letter, as seen in Exhibit 2. For other purposes, such as addressing respondent concerns, most interviewers did not note a preference between the two letters. Overall, responses from field interviewers indicated a positive view of the experimental letter.
Exhibit 2. Field interviewer letter preference
The primary goal of this experiment was to determine whether the experimental letter, with revisions to make the text clearer and easier to understand, is more effective in encouraging respondent participation in the MCBS Community interview. A secondary goal was to determine whether the experimental letter has any negative implications for respondent cooperation. Looking at a range of key performance metrics, the experimental letters are indeed associated with somewhat higher completion rates and may be easier for respondents to recall. Some level of caution is warranted in interpreting these results, however. Advance letters are mailed at the beginning of the data collection round, and interviews may be completed any time within the data collection period. Given the number of variables associated with field interview projects, from interviewer effects to regional differences, it is possible the differences are due to other causes.
The findings above present a case for adopting the experimental advance letter exclusively starting with Fall 2020 data collection and beyond. The letter is more closely aligned with best practices for advance mailings, uses clearer language that helps field interviewers gain respondent cooperation, and is preferred by field interviewers. NORC thus recommends that CMS (and OMB) approve sending the experimental letter to all respondents in future data collection rounds. No edits to the letter appear to be necessary at this time. In addition, this experiment has paved the way for additional experiments to further maximize the impact of this advance mailing. In Fall 2020, an experiment has been proposed to redesign the brochure and test a frequently asked questions document as an alternative to the brochure. This “Phase 2” test will be submitted for OMB clearance in a forthcoming Generic Clearance request.
Appendix A: Experimental Letter

July 22, 2019
[FIRST NAME] [LAST NAME]
[ADDRESS]
[CITY], [STATE] [ZIP]
Dear [Mr./Ms.] [LAST NAME]:
Within the next few weeks, a representative of our agency will be coming to your home to ask permission to interview you about your experiences receiving Medicare services. The representative will ask to talk with you for about an hour during that visit or at another time that would be more convenient.
The Centers for Medicare & Medicaid Services is conducting this study to better understand the experiences of people with Medicare. The best way to gather this information is by hearing directly from people with Medicare.
We have selected you as part of a sample of people with Medicare from across the United States that can give us an accurate picture of how well people’s health care needs are being met.
Your participation in the study is your choice. Your Medicare benefits cannot be affected in any way by your decision to participate or the answers you provide, and your information will be kept private to the extent permitted by law, as prescribed by the Federal Privacy Act of 1974.
The representative who will contact you is from NORC at the University of Chicago, the research institution collecting this information for us. This person will have identification showing they are a representative for this Medicare survey.
If you have any questions, please call NORC toll-free at 1-877-389-3429, or email [email protected]. If it would be more convenient for you to set up an appointment for your interview, please call or email us. The enclosed brochure has more information about why we are conducting this study. You can also visit the study website at mcbs.norc.org.
I hope you’ll be able to help us with this important project to improve Medicare services.
/s/
Debra Reed-Gillette, Director
Medicare Current Beneficiary Survey
Centers for Medicare & Medicaid Services
Appendix B: Control Letter

July 22, 2019
Dear [FIRST NAME] [LAST NAME]:
The Centers for Medicare and Medicaid Services (CMS), part of the U.S. Department of Health & Human Services, would like you to help us better understand the needs of Americans enrolled in Medicare.
Since 1991, we have conducted an important study called the Medicare Current Beneficiary Survey. For the last 25 years, this survey has been the nation’s primary source of information about how Medicare affects the people it serves. Because we cannot interview everyone on Medicare, we selected a sample of enrollees to represent all of those on Medicare. You have been selected as a result of a scientific process that ensures all beneficiaries are represented in the survey.
In order to get an accurate picture of the needs of the Medicare population, we need to meet with all selected beneficiaries. Through this study, we gather important information that cannot be obtained in any other way. A few of the topics we would like to discuss with you include: your access to health care services, your use of health care services, the rising cost of care, and your satisfaction with the care you received.
As a participant in the study, you will represent thousands of other people similar to you. All of your information will be kept private to the extent permitted by law, as prescribed by The Federal Privacy Act of 1974. Your participation is voluntary. Let me emphasize that your Medicare benefits cannot be affected in any way by the answers that you provide, or by whether or not you choose to participate. I hope you will decide to join us in this important study.
NORC at the University of Chicago, a respected social science research organization, has been contracted to conduct the study. A professional interviewer will contact you in person or by phone to setup a visit. If you agree to participate in the study, the interview will take about one hour.
If you have any questions about the study or would like to schedule an appointment to participate in this important study, please call NORC toll-free at 1-877-389-3429, or email NORC at [email protected]. Enclosed is a brochure that provides you with more information about the survey. You can also visit the MCBS respondent website at www.mcbs.norc.org to learn more.
The Medicare Current Beneficiary Survey is important to the future of Medicare. Please help us in this national effort to improve your Medicare program.
Sincerely,
/s/
Debra Reed-Gillette
Director, Medicare Current Beneficiary Survey
Centers for Medicare and Medicaid Services
Appendix C: Advance Letter Questions in the Interviewer Remarks Questionnaire (IRQ)

An Advance letter about this study was sent directly to the Beneficiary prior to first contact. Did the Beneficiary remember seeing the Advance Letter?
Yes
No
Don’t Know
Did you mention the Advance Letter to the Beneficiary as a gaining cooperation tool?
Yes
No
Don’t Know
[1] The cases considered were limited to the Incoming Panel released in the first wave of Community component cases. Two additional, smaller case releases later in the fall round were excluded from the experiment.
[2] A “hard” refusal includes dispositions such as “Hostile Refusal”, which generally result in the case being dropped from the survey. A “soft” refusal includes the more generic “Refusal” disposition, which indicates hesitation to participate. Interviewers generally continue to work to try to convert soft refusals.
[3] Examples of records of call excluded as non-contacts include updates such as “Locating Results-Batch”, “Supervisor Review”, and “Comment”, all of which indicate that the sample member was never reached.