CMS-10445: Medicare Advantage Quality Bonus Payment Demonstration

Supporting Statement A

OMB: 0938-1195

I. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSIONS


Background


The Centers for Medicare & Medicaid Services (CMS) has contracted with L&M Policy Research (L&M) and its partner, Mathematica Policy Research (Mathematica), to support the Agency in responding to the provision of the Patient Protection and Affordable Care Act (PPACA) which required, beginning in 2012, quality bonus payments (QBPs) to all plans earning four or five stars in Medicare’s Star Rating program. As an extension of this legislation, CMS launched the Medicare Advantage Quality Bonus Payment Demonstration, which accelerates the phase-in of QBPs by extending bonus payments to three-star plans and eliminating the cap on blended county benchmarks that would otherwise limit QBPs. Through this demonstration, CMS seeks to understand how incentive payments impact plan quality across a broader spectrum of plans. CMS has engaged our research team to evaluate this demonstration.


The research team plans to conduct this data collection effort in the form of a survey of Medicare Advantage Organizations (MAOs) and up to 10 case studies with MAOs in order to supplement what can be learned from the research team’s analyses of administrative and financial data for MAOs, and from an environmental and literature scan.


  A. JUSTIFICATION



  1. Need and Legal Basis

This collection is part of the evaluation of the Medicare Advantage Quality Bonus Payment Demonstration, which accelerates the phase-in of QBPs mandated by the PPACA.


This data collection is needed by CMS as part of the evaluation of the QBP demonstration to better understand what impact the demonstration has had on MAO operations and their efforts to improve quality. The data collection instrument for the survey is a questionnaire designed to capture how MAOs perceive the demonstration and how they are planning or implementing changes in quality initiatives, and to identify factors that help or hinder the capacity to achieve quality improvement and that influence the decision to make changes. The case studies will be conducted as a series of open-ended discussions with MAO staff, guided by a discussion protocol.


  2. Information Users

Because this is a new collection, the information is expected to provide CMS with a detailed picture of the kinds of quality initiatives utilized by MAOs and some preliminary information on how MAOs assess the effectiveness of these programs. The survey is designed to provide an overall picture of the QBP demonstration that can be used for national comparisons across plans as part of the larger evaluation. The case studies will supplement the information gathered from the survey and data analysis, providing valuable context and details about successful quality improvement activities. The case studies are particularly well suited to exploring the detailed characteristics of the plans' quality improvement activities, emphasizing the decision-making and thought processes underlying the structure and direction of their efforts. The case studies will capture the mechanics of the quality improvement programs, as well as the contextual factors that affect the nature, structure, and scope of the programs.

  3. Improved Information Technology

The research team is proposing a multi-mode data collection approach for the survey: a self-administered paper survey mailed to participants, with telephone prompting for participants who fail to complete and return the mailed questionnaire within the designated time period. During the phone prompt, interviewers will encourage participants to return the questionnaire by mail or fax and will offer to complete the survey over the phone or to send them an electronic version.


The team is choosing a mail methodology over a web survey for three main reasons: (1) the small sample size makes programming and administering a web survey costly and inefficient; (2) a mail survey is more portable than a web survey, allowing respondents to easily seek input from other MAO staff if needed; and (3) the sample frame does not contain the electronic addresses that are the most effective means of contacting respondents for a web survey. However, when requested by an MAO, a Microsoft Word version of the survey material will be sent as an email attachment or by fax.

The case studies will be conducted in person at the MAO’s offices. Conducting the research in person aids in building rapport between the research team and the MAO representatives, which can encourage more candid discussion.


  4. Duplication of Similar Information

The survey and case studies will ask MAOs only about information that they have not already reported to CMS and that is not available through the Health Plan Management System (HPMS), the portal through which CMS and MAOs share plan details and through which CMS sends much of its communications to plans. The proposed data collection focuses specifically on MAO perceptions of and attitudes toward quality ratings and the QBP demonstration. Because the QBP demonstration has not been previously conducted, the information gathered through the survey and the case studies has not previously been collected.


The survey instrument and discussion protocols focus on questions that the team cannot answer through data analyses alone. By forgoing this data collection, CMS would lose the opportunity to gather insights into the perceptions and operational changes being made by MAOs in response to the QBP demonstration.


  5. Small Businesses

Small businesses or other small entities are neither involved in nor significantly impacted by this program.


  6. Less Frequent Collection

The survey and case studies seek information that the team cannot obtain through other data analyses. By forgoing this data collection, CMS would lose the opportunity to gather insights into the MAOs' perspectives on the value of the QBP demonstration and its impact on their ability to improve the quality of care and services provided to Medicare beneficiaries as a result of the incentive payments offered. The survey will be fielded only once, and the limited case studies will also be conducted only once.


  7. Special Circumstances

There are no special circumstances that would cause the collection of information to be inconsistent with 5 CFR 1320.6.


  8. Federal Register Notice/Outside Consultation


The 60-day Federal Register notice (77 FR 57090) was published on September 17, 2012. Comments were received from the public, and those comments have been addressed.


The research team has consulted a number of experts on MAO operations, both within and outside CMS, for suggestions on the type of information to be collected, the availability of data, and the content and wording of the survey instrument. This input will also guide the development of the case study discussion guides.


The survey instrument was pre-tested with up to nine MAO representatives who have knowledge about quality improvement initiatives. The case study discussion guides will be piloted with one MAO and revised as needed before completing the remaining nine case studies.


  9. Payment/Gift to Respondents

There are no plans for payment of any kind to respondents.


  10. Confidentiality

The survey and case studies will collect information about quality improvement initiatives in MAOs. The questions asked will refer to the MAOs and not to individuals. The information collected will be kept secure to the extent permitted by law.

  11. Sensitive Questions

The survey and case studies will not ask about matters sensitive to the operation of the MAOs, such as profit margins or other competitive information, and thus contain no questions of a sensitive nature.


  12. Burden Estimate (Total Hours and Wages)

Table A.1 presents estimates of the response burden.


We estimate the pre-survey initial call will take 10 minutes to complete. This call will briefly introduce the survey, ask for a commitment to return the questionnaire quickly, and collect contact information needed to mail out the survey materials. Based on our pretest, the survey questionnaire will take approximately 55 minutes to complete for plans with one contract. MAOs with multiple contracts will be asked to complete one survey questionnaire for each contract and will be encouraged to complete an electronic version of the survey to make the process easier and quicker. We estimate that after the initial questionnaire is completed, each subsequent electronic survey questionnaire will take approximately 25 minutes to complete. The information requested in the survey is usual and customary for MAO representatives knowledgeable about quality improvement activities. There are no cost burdens, as respondents incur no capital or startup costs and no operation or maintenance costs.


In estimating the burden for the case studies, we have assumed that the team will interview between 12 and 18 individuals during each of the 10 site visits, with each interview lasting between one and two hours, depending on the availability and role of the MAO representative. We have used the upper estimates when calculating the burden for Table A.1.


Table A.1. Estimated Annualized Burden Hours

Form Name                                                  Number of    Hours per    Annual Hour
                                                           Responses    Response     Burden
MAO Survey Initial Call                                          550        0.17            92
MAO Mail Survey for Single Contract MAOs (first survey)           72        0.92            66
MAO Mail Survey for Multiple Contract MAOs
    First survey                                                  59        0.92            54
    Each additional survey                                       309        0.42           130
Case Study Interviews                                            180        2.00           360
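
As a check on the figures above, the following sketch reproduces the Table A.1 arithmetic. The response counts and per-response times are taken from this statement; the script itself is illustrative only. Note one rounding quirk: the table rounds 25 minutes to 0.42 hours per response, so it reports 309 x 0.42, or approximately 130 hours, for additional surveys, while exact minutes give round(309 x 25 / 60) = 129 hours.

    # Minimal sketch of the Table A.1 burden arithmetic.
    # Counts and per-response minutes come from this statement.
    rows = [
        ("MAO Survey Initial Call",                         550,  10),
        ("Single Contract MAOs, first survey",               72,  55),
        ("Multiple Contract MAOs, first survey",             59,  55),
        ("Multiple Contract MAOs, each additional survey",  309,  25),
        ("Case Study Interviews (upper estimate)",          180, 120),
    ]

    total = 0
    for form, responses, minutes in rows:
        hours = round(responses * minutes / 60)  # convert minutes to whole hours
        total += hours
        print(f"{form}: {responses} x {minutes} min = {hours} hours")
    print(f"Total annual burden: {total} hours")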


  13. Capital Costs (Maintenance of Capital Costs)

No capital costs will accrue to respondents.


  14. Cost to Federal Government

The estimated cost to the federal government for design, field testing, obtaining approval of the OMB package, data collection, and an analysis of the findings is $610,117.


  15. Program or Burden Changes

Subsequent to the publication of the 60-day Federal Register notice (77 FR 57090), the survey was revised as follows.


The survey was revised to remove questions determined to be duplicative and to add questions requested either during the public comment period or during the pre-test. Minor revisions to some response categories and editorial corrections were also made.


In addition to a paper survey, MAOs now also have the option of submitting an electronic version of the survey. As a result, the burden for MAOs with multiple contracts has decreased. We assume that MAOs with multiple contracts will complete an electronic version of the instrument because, after they have completed the initial survey instrument, they will be able to more quickly complete subsequent surveys.


  16. Publication and Tabulation Dates

The survey will be conducted in 2013. The following table shows the overall schedule for the survey, including the beginning and ending dates for data collection.


PROPOSED SURVEY SCHEDULE

Activity                                                                     Time Frame
Mail letter to MAO CEOs and presidents                                       June 4 – 6, 2013
Conduct pre-survey screening calls to identify survey respondents            June 6 – July 4, 2013
Mail out advance letter and copy of survey to all respondents                June 6 – July 4, 2013
Send fax, mail, or electronic reminders about survey to all nonrespondents   June 18 – July 18, 2013
Make follow-up calls to nonrespondents                                       June 18 – July 18, 2013
End data collection                                                          August 12, 2013
Clean data                                                                   August 15 – September 6, 2013
Prepare analytic file and analyze data                                       September 7 – October 21, 2013
Final interim report                                                         November 29, 2013
Final evaluation report                                                      July 29, 2015


The findings of the plan survey will be reported through an interim report submitted by the contractor to CMS in November 2013. A first and fundamental step in the analysis of survey results will be to examine the variables of interest for normality, identifying those with a skewed distribution and potentially transforming the data (e.g., log form) to impose normality. For continuous variables, this univariate analysis can be conducted with visual inspection of the variables through scatter plot matrices, box-plots, and other graphical displays. Analysis of outliers will also be an important component of the univariate analysis. In addition, simple frequencies of study variables will be calculated for the population total and by stratum of interest (e.g., plan type, geography, program type) to provide a basic description of the study population. The next step will involve bivariate analyses to examine the relationships among the variables of interest. Again, visual inspection through graphical analyses will be performed to observe the directionality of the relationships.
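
As an illustration of the univariate step described above, the sketch below (Python, using pandas and scipy) flags skewed variables and applies a log transform. The file name and variable names are hypothetical placeholders, since the analytic file will not exist until the survey is fielded.

    import numpy as np
    import pandas as pd
    from scipy import stats

    # Hypothetical analytic file and variable names, for illustration only.
    df = pd.read_csv("mao_survey_analytic_file.csv")
    variables_of_interest = ["enrollment", "qi_spending", "star_rating"]

    for var in variables_of_interest:
        values = df[var].dropna()
        skew = stats.skew(values)
        print(f"{var}: skewness = {skew:.2f}")
        # Heavily skewed variables are candidates for a log transform;
        # log1p guards against zero values.
        if abs(skew) > 1.0:
            df[f"log_{var}"] = np.log1p(df[var])

    # Simple frequencies by stratum of interest (e.g., plan type).
    print(df.groupby("plan_type")["star_rating"].describe())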


While this study is largely descriptive, multivariate analyses can be instructive in identifying the magnitudes and likelihoods of relationships between health plan or quality improvement characteristics and outcomes of interest. For example, we can look at which characteristics might be associated with plan quality, payments, and changes in quality over time to help document the demonstration’s impact. The specific models will be developed after receipt of the data. However, a sample multivariate regression model can be summarized by the equation below along with brief examples of data elements.


Yi = β0 + β1′X1′ + β2′X2′ + β3′X3′ + β4′X4′ + εi


i = unique health plan (i = 1, 2, 3, …, n, where n = sample size)

Yi = outcome variable for health plan i, which may include:

  • Participation/attrition %

X1′ = vector of health system/health plan characteristics, which may include:

  • Health plan features (size, enrollment, model type/contracting)

  • Geography (region, state)

X2′ = vector of program characteristics, which may include:

  • Program orientation (e.g., patient/provider/both)

  • Data system type

  • Mode/frequency of identification approaches

  • Monitoring/education features

  • Duration

  • Provider support tools

X3′ = vector of target population characteristics, which may include:

  • General characteristics of enrollees (e.g., eligibility category, gender, race/ethnicity, dual eligible status, age) – we will pilot test the capacity of plans to provide this

εi = error term
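
A minimal sketch of how such a model might be estimated is shown below, assuming a hypothetical analytic file with one row per health plan. The variable names are illustrative stand-ins for elements of the X1′–X3′ vectors, not the final specification, which will be developed after receipt of the data.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical analytic file; one row per health plan.
    df = pd.read_csv("mao_survey_analytic_file.csv")

    # Illustrative stand-ins for the covariate vectors above.
    model = smf.ols(
        "participation_pct ~ plan_size + C(model_type) + C(region)"  # X1': plan characteristics
        " + C(program_orientation) + program_duration"               # X2': program characteristics
        " + pct_dual_eligible + mean_enrollee_age",                  # X3': target population
        data=df,
    )
    print(model.fit(cov_type="HC1").summary())  # HC1 robust standard errors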


A comprehensive analytic plan specifying the range of univariate, bivariate, and multivariate analyses to be conducted will be submitted by the contractor to the CMS Project Officer for review following preliminary data inspection, but prior to formal data analysis. We anticipate that this will include prevalence estimates of health plan and quality improvement characteristics. Assuming an 80% response rate for the survey, we anticipate approximately 440 observations will be available for analysis. This sample is sufficient for robust multivariate analyses using the entire sample; however, the team will take care when conducting any stratified analyses to ensure that the sample size is still sufficient. These determinations will be made once the data are available and the distribution of observations across the strata of interest can be assessed. Depending on the distribution of the survey responses, we would explore the following types of research questions:

  • Were there significant quality improvements under the demonstration?

  • Was there improvement across the ratings continuum?

  • Which quality domains and measures showed the greatest improvement?

  • Did quality continue to improve throughout the three years of the demonstration?

  • What types of contracts showed the most improvement?

  • Did certain types of contracts show more improvements in selected quality domains?

  • How much money in QBPs was paid under the demonstration? How was the money distributed among MA organizations?

  • Does enrollment increase in organizations with improving quality?

  • What quality domains appear to be associated with changes in enrollment patterns?


The final report will include updated sections from the interim and case study reports, a detailed presentation of synthesized results by the Aims and research questions, and conclusions. The final report will be submitted to CMS in July 2015.


PROPOSED CASE STUDY SCHEDULE

Activity                              Time Frame
Draft discussion guides               July 31, 2013
Final discussion guides               August 30, 2013
Finalize MAOs for case studies        January 31, 2014
Recruit and schedule interviews       February 1 – February 28, 2014
Conduct site visits                   March 1 – April 30, 2014
Analyze data                          May 1 – May 30, 2014
Draft case study report               May 30, 2014
Final case study report               July 31, 2014


While onsite, the team will enter interview notes into a customized Access database to facilitate analysis. Following each site visit, the team will review the information housed in the Access database, in addition to other materials collected from the organization, and individually identify themes and findings by aim and question category. Areas to be explored in greater detail with the case study sites will include questions about organizational decisions regarding investment in quality improvement activities such as:

  • What has been MAOs’ experience with quality ratings and quality improvement activities thus far? In which areas (domains and individual measures) do MAOs believe they have the greatest ability to affect improvements? Why?

  • How have MAOs decided on their investments to improve quality, and to what extent are these activities in response to the demonstration and the expected associated payoff? What has been their investment of time and money specifically to improve quality in response to the demonstration?

  • How have they weighed level of investment versus financial return under the demonstration? How much investment do organizations consider necessary to produce sufficient quality improvement to increase a plan rating by one star? Are these efforts part of long-term strategies or focused on short-term results?

The team will then synthesize findings from the case studies and submit the draft Case Study Report by May 30, 2014, finalizing the report by July 31, 2014.
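
The customized Access database mentioned above can be illustrated with a minimal sketch of one possible structure for coding interview notes by aim and question category. The sketch uses SQLite for portability rather than Access, and all table and field names are illustrative assumptions, not the team's actual design.

    import sqlite3

    # Illustrative schema for case study interview notes.
    conn = sqlite3.connect("case_study_notes.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS interviews (
        interview_id    INTEGER PRIMARY KEY,
        mao_site        TEXT NOT NULL,   -- masked site identifier
        respondent_role TEXT,            -- e.g., quality improvement officer
        interview_date  TEXT
    );
    CREATE TABLE IF NOT EXISTS notes (
        note_id           INTEGER PRIMARY KEY,
        interview_id      INTEGER REFERENCES interviews(interview_id),
        aim               TEXT,          -- evaluation aim the note addresses
        question_category TEXT,          -- discussion guide question category
        theme             TEXT,          -- coded theme or finding
        note_text         TEXT
    );
    """)
    conn.commit()

    # Themes can then be tallied by aim and question category:
    for row in conn.execute(
        "SELECT aim, question_category, COUNT(*) FROM notes "
        "GROUP BY aim, question_category"
    ):
        print(row)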


  17. Expiration Date

The OMB expiration date will be displayed on the mail questionnaire, on the letters, and on any advance material sent to respondents.


  18. Certification Statement

The data collection will conform to all provisions of the Paperwork Reduction Act and CMS does not request any exemptions from the certification statement.


  B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

    1. Respondent Universe and Sampling Methods

After removing any plans that are considered ineligible (such as 1876 cost plans or those not currently active), it is anticipated that approximately 550 plans will comprise the survey sample. We plan to survey all eligible MA plans and so will not sample from this population. The mail survey will be completed by about 440 (or 80 percent) of these plans. Contact information for MA plans will come from CMS’s contract and plan contact databases maintained in the HPMS.


For the case studies, the research team will select up to 10 MAOs, using data from the HPMS, the MAO survey, and the environmental scan, to reflect diversity across the following types of characteristics:

  • Contract type

  • Basic plan structure (tightly managed group model HMO vs. looser HMO or PPO)

  • Geography and payment cohort

  • Beneficiary population

  • Star rating/quality scores, especially improvement in ratings

  • Corporate ownership


    2. Procedures for the Collection of Information

The survey will be conducted with all MA plans under contract with CMS in 2012. To identify a knowledgeable respondent and gain commitment to participate from the MA contract representative/survey respondent, letters will be mailed to the president or CEO of each sampled health plan (see Appendix B). The letter will explain the purpose of the study, request that a survey respondent be designated, and provide a contact name, phone number, and email address for returning the respondent information. The ideal respondent will be a senior staff member who is most knowledgeable about the quality improvement activities conducted by the MA plan. Telephone interviewers will then contact that person to briefly introduce the survey, ask for a commitment to return the questionnaire quickly, and collect contact information needed to mail out the survey materials (see Appendix C).

Each MA plan respondent will then receive a cover letter on CMS stationery, a questionnaire, a fact sheet containing detailed information about the study, and a postage-paid return envelope. This material will be sent via FedEx because of the overnight delivery feature and the assurance that the material will reach the correct person. When requested by a survey respondent, a Microsoft Word version of the survey will be sent as an email attachment or by fax. Shortly after questionnaire delivery, respondents will be called to confirm that the materials have been received, respond to any questions or concerns, and emphasize the need for a timely response (see Appendix D). Two weeks after the initial mailing, we will make telephone calls to all nonrespondents to encourage them to complete and return the mail questionnaire, and we will also give them the option of completing the survey by telephone. Reminder calls will continue weekly throughout the field period.

For the case studies, the team will begin outreach to the selected MAOs using a recruitment package that includes a description of the project, site visit process, types of staff members we would like to speak with, range of proposed dates, and time commitment required. We also propose to include an “endorsement” letter – a brief explanation of the project on CMS letterhead encouraging participation and assuring that participation does not impact their contracts. As appropriate, the team will also enlist the support of expert/stakeholder interviewees to lend further credence to research activities in the eyes of those identified for the case studies. Finally, it will be important to convey to the organizations that contract-specific data will only be shared in an aggregate format and that the team will take steps to mask the feedback of the participating organizations. The cover letter will address these issues and convey the willingness of the team to work with the plan to address any other privacy/confidentiality issues they may have.


Unless an appropriate contact has already been identified, the recruitment package will be sent to the president or CEO of the health plan. We will then follow up with the staff member to whom we are referred to schedule specific dates and times for site visits, depending on the availability of the MAO’s representatives.


The site visit team will consist of three project team members (two senior staff and one junior staff). Under the direction of the task leader, a research associate will work with each site to schedule the interviews at a convenient time. Each site visit will include interviews with key administrators, such as the chief executive, operating, financial, medical, and quality improvement officers, as well as other key employees leading performance improvement efforts connected to quality scores. By meeting with multiple people within the case study sites, the site visit team will be able to better understand the environment within which each site is operating and its history in terms of quality improvement and star ratings. We will schedule three-day site visits to ensure flexibility in accommodating the schedules of the various health plan staff members who will be interviewed, and to minimize the need for follow-up telephone calls after the site visit.


    3. Methods to Maximize Response Rates and Deal with Nonresponse

Response rates in the mail survey of MA plans will be maximized in a number of ways. We will mail introductory letters on CMS stationery and follow up with telephone calls to determine whether selected MA plans are eligible for the survey. During these calls, we will also identify the person most knowledgeable about the MA plan; we will then mail the survey to the designated survey respondent. The cover letter, which will be personalized and printed on CMS letterhead, will include contact information for the CMS project officer and a toll-free number at Mathematica to call for questions or to request a copy of the questionnaire. The letter will describe the evaluation and the purpose of the mail survey, and will provide instructions. It will indicate that the survey is voluntary and give the estimated time for completion.


Follow-up telephone calls by trained interviewers (during which plans can complete the survey) extend our strategy. We will send the initial questionnaire by mail and place a follow-up call if the respondent has not returned the questionnaire within two weeks. We will continue to place follow-up calls weekly throughout the field period to encourage response. We will offer multiple-mode options (mail, telephone, email, and fax). Interviewer training materials will emphasize tips for dealing with nonresponse. The questionnaire is relatively short and has only a few open-ended response categories. There are clear instructions on the first page. We considered making the survey available on the web but concluded that it was not cost effective to expend resources to program a web survey instrument when we expect to complete surveys with roughly 440 MA plans.


The response rate for the mail survey will be calculated as the number of MA plans that complete the questionnaire (either by mail or by telephone) divided by the total number of MA plans that were deemed eligible and mailed a questionnaire (all unique MA plans). Because we know the universe of approved, unique MA plans or contracts, the denominator of the response rate does not include ineligible plans or plans whose eligibility is unknown. Response rate calculations are based on standards established by the American Association for Public Opinion Research.
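
The calculation itself is straightforward; the sketch below uses the projected figures from this statement (440 completes out of 550 eligible plans) purely for illustration.

    def response_rate(completes: int, eligible_mailed: int) -> float:
        """Completed questionnaires (mail or telephone) divided by all
        eligible MA plans mailed a questionnaire; plans that are ineligible
        or of unknown eligibility are excluded from the denominator."""
        return completes / eligible_mailed

    # Projected: 440 completes out of 550 eligible plans -> 80%
    print(f"{response_rate(440, 550):.0%}")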


Based on previous surveys with similar populations, we anticipate achieving a minimum response rate of 80 percent on the survey. For non-respondents, we will construct a profile based on characteristics of the plans drawn from the HPMS and data collected through the pre-survey screening outreach.


To encourage participation in the case studies, the team will use the recruitment package and CMS endorsement letter described in Section B.2 above. That outreach describes the project and the time commitment involved, assures MAOs that participation does not impact their contracts, commits to sharing contract-specific data only in aggregate form and masking the feedback of participating organizations, and conveys the team's willingness to work with each plan to address any other privacy or confidentiality concerns.


To facilitate MAO participation, the research team will travel to the MAO’s offices to conduct the interviews and schedule the interviews for dates and times that are convenient for the MAO representatives.


    4. Tests of Procedures or Methods to Be Undertaken

Up to nine MA plans will be selected to pretest the survey instrument. Plans that have a range of quality scores will be selected for the pretest so that the entire questionnaire can be efficiently tested. The pretest will identify any items that are burdensome or difficult to respond to, and these items will be removed or revised accordingly. An average response time will be estimated from the pretest surveys.


    5. People Consulted on Statistical Aspects, and People Collecting or Analyzing Data

The following people have contributed to the questionnaire content and design, or to statistical or analytic features of the study:


Gerald Riley, Project Officer

Centers for Medicare & Medicaid Services

(410) 786-6699

[email protected]


Sarah Gaillot

Centers for Medicare & Medicaid Services

(410) 786-4637

[email protected]


Dr. Lisa Green, Project Director

L&M Policy Research

(240) 476-6663

[email protected]


Kelly Moriarty, Deputy Project Director

L&M Policy Research

(310) 428-7953

[email protected]


Julia Ann Doherty

L&M Policy Research

(202) 688-2936

[email protected]


Rachel Dolin

L&M Policy Research

(202) 688-2936

[email protected]


Martha Kovac, Survey Director

Mathematica Policy Research

(609) 275-2331

[email protected]


Eric Grau, Statistician

Mathematica Policy Research

(609) 945-3330

[email protected]


Lauren Maul, Survey Specialist

Mathematica Policy Research

(312) 994-1041

[email protected]



APPENDIX A: MAIL SURVEY

APPENDIX B: ADVANCE LETTER

APPENDIX C: INITIAL CALL FORM

APPENDIX D: REMINDER CALL SCRIPT

APPENDIX E: FREQUENTLY ASKED QUESTIONS

