CMS Response to SPIA Comments, October 10, 2007

Medicaid State Program Integrity Assessment (SPIA), CMS-10244 (OMB Control No. 0938-1033)


CMS Response to Comments

Medicaid State Program Integrity Assessment (SPIA) System Federal Register Notice

Published July 27, 2007




Comment #1:

Some of the draft questions in the data collection instrument are vague and difficult to interpret. There appears to be some difficulty in distinguishing Program Integrity functions from other State Medicaid oversight administrative functions that also help protect the “integrity” of the Medicaid program. Each State’s Medicaid program is unique and organized in a different manner, and there can be multiple activities under the umbrella of “Program Integrity” that vary widely from state to state.


CMS Response:

CMS understands that, before collecting these data on a national level, the terms used in the data collection instrument need to be clearly defined so that States understand what information to report to CMS. To facilitate this, CMS has developed a comprehensive glossary of terms and definitions to accompany the SPIA data collection instrument. See Appendix C for a draft of the Medicaid Integrity Program (MIP) glossary.




Comment #2:

Develop a glossary of terms for the data collection instrument so that States can consistently interpret the questions.


CMS Response:

See response to Comment #1.




Comment #3:

Pilot test the survey first with a small group of states.


CMS Response:

In early 2007, CMS conducted a case study pilot with nine states to aid in the design and development of an approach to the national SPIA system. As part of the pilot, CMS developed a data collection instrument that was tested with the nine case study states. Based on the results of the pilot and feedback from the case study states and MIP Advisory Committee, CMS made extensive revisions to the data collection instrument. CMS has also continued to revise the survey instrument collaboratively with States during the development of SPIA performance measures over the past few months. In addition, CMS views the first national data collection for the SPIA as a “pilot” and will make revisions to the instrument and data collection procedures based on lessons learned.







Comment #4:

Also work with States to establish a clear, consistent, and fair methodology for determining return-on-investment and other outcome measures.


CMS Response:

As part of our continued work with States and the MIP Advisory Committee, CMS convened a return-on-investment (ROI) subcommittee and a performance measure subcommittee to provide input and feedback during the development of the draft SPIA performance measures, including ROI. Members of both subcommittees have been actively involved in the development and refinement of the measures.




Comment #5:

Ask the States what kinds of assistance or support from CMS would most enhance State Medicaid Integrity efforts (for example, improved notification from the OIG about excluded providers).


CMS Response:

CMS has included a question on the SPIA data collection instrument that asks States to identify three areas in which CMS can provide technical assistance and support (See Appendix B, question 55).




Comment #6:

Ensure that States have enough time to incorporate program integrity outcome data into their case management and reporting systems; otherwise the estimated burden on the States would be much heavier than envisioned by the legislation.


CMS Response:

All of the data elements for the SPIA data collection have been identified through a collaborative effort with State program integrity officials.  The vast majority of this information should be easily collectible. We encourage States to automate as much of this reporting as feasible.  However, in the absence of such a system, we do not believe that collecting it will be an unreasonable burden.  Prior to automating any information collection, States should be aware that these data elements will evolve as CMS learns more from the States through its initial collection process.










Comment #7:

Question 7: Estimate of Expenditures for Medicaid Integrity activities ($)


Without clear direction as to what kinds of costs States should include in this figure, CMS will have no assurance that the information is valid and comparable across States. This is especially important since the expenditures reported here might be used in calculating States’ return on investment (ROI) for Program Integrity activities. Accordingly, CMS should request more detailed information on Program Integrity expenditures:

  • Payroll

  • Benefits

  • Operating

  • Contractual Costs (dedicated PI contracts as well as pro-rated share of general agency contracts)

  • IT Costs

  • Indirect Cost Rate

In addition, this question makes no distinction between expenditures for a distinct Program Integrity unit/division versus any type of “integrity” function within the Medicaid agency. As a basis for this data collection instrument, therefore, CMS should use a narrowly focused definition of Program Integrity that is common across all States. For numerical and financial data collection, we request that CMS limit the definition of Program Integrity functions to those required under 42 CFR 455 Subpart A, Medicaid Agency Fraud Detection and Investigation Program; 42 CFR 456.23, Post-Payment Review Process; and 42 CFR 1002, Program Integrity – State Initiated Exclusions from Medicaid.



CMS Response:

CMS’ approach for the SPIA data collection is to start with a high level of information in order to minimize burden on the States. We will reassess the level of detail in the questions based on the information we receive after the first data collection. CMS agrees with the need for a focused definition of program integrity that will be common across all States and will provide guidance as part of the instructions for the data collection. We have revised the question to reflect the categories for Medicaid Integrity activities used throughout the instrument (see Appendix B, question 10).








Comment #8

Question 8: Organizational Structure for Medicaid Integrity Activities within State


The three choices presented seem to encompass all possible combinations of Program Integrity structures. However, a comment field should be added to this question to allow States to better describe their organization and what the State considers to be its Medicaid Integrity department or unit.


CMS Response:

The SPIA data collection instrument includes a question that addresses the activities that the State includes under Medicaid integrity and an “Other, please specify” response option to provide additional information (See Appendix B, question 9).




Comment #9

Question 9: Activities that the State includes under the Scope of Program Integrity


These terms are vague and too broad, and will mean different things to different people. At best this question will provide only some very general information. This question does not recognize the fact that activities CMS views as “program integrity” can be performed by the Program Integrity department or unit and most likely will also be performed by other parts of the State Medicaid agency, such as an audit division, IT department, etc. Finally, the list of choices for this question does not include anything related to beneficiary or recipient fraud and abuse, which can be an integral part of the Medicaid Integrity function. These same concerns also apply to question 10.


CMS Response:

CMS has included definitions for the terms in this question in the draft MIP glossary (see Appendix C). CMS understands the variation in State program integrity models and that responses will vary based on the model identified by the State. The question is intended to provide CMS with descriptive information on State program integrity activities. Beneficiary or recipient fraud is not included under the purview of the Medicaid Integrity Program.









Comment #10:

Question 12: Total number of FTEs by type of position (e.g., Auditor, SURS Analyst) for all functions considered to be Medicaid Integrity.


The value of this information is unclear, and it may be difficult for States to answer this because every State has a different way of naming its positions. Does CMS just want information for the distinct Medicaid Integrity unit or department, or does it want this information for any activity, such as audits of provider compliance with program and contractual requirements, that the State considers to be a Medicaid integrity, audit, or oversight function? Also, if CMS wishes to use this question as a measure of the resources the State allocates to the program integrity function, then it might be better to ask how many of the state’s Program Integrity FTEs are skilled medical professionals or have a certification such as CPA, CRE, etc. Also, States should distinguish between filled and vacant FTEs.



CMS Response:

CMS is asking States to report the total number of FTEs for ALL functions that the State considers to be Medicaid Integrity, regardless of the program integrity model in the State. We have revised the question to reflect the categories for Medicaid Integrity activities used throughout the instrument (see Appendix B, question 10) and will reassess the question based on the information we receive from the first data collection. We have also revised the question to distinguish between filled and vacant FTEs.




Comment #11:

Question #13: Inventory of IT resources used to conduct Medicaid Integrity activities


The description of choices should be more detailed and specific. It is not clear what the differences are between “SURS I”, “SURS II”, etc. If the State’s decision support system is also certified as the agency’s SURS system, should the State check both “SURS” and “decision support system”?


CMS Response:

CMS has included definitions for the terms in the draft MIP glossary (see Appendix C). The question indicates “check all that apply”; therefore, if the State’s decision support system is also certified as the agency’s SURS system, the State should check both “SURS” and “decision support system”.





Comment #12:

Question 14: Estimated Expenditures for IT Resources


Should States include some pro-rata share of the MMIS in this calculation? Also, is there a provision to ensure that this amount is also reflected in the State’s Medicaid Integrity expenditures as reported in response to question 7? CMS should ensure that the same question is not asked twice within the survey.


CMS Response:

CMS agrees and has deleted the question from the data collection instrument.

The estimate of expenditures for Medicaid Integrity activities should not include a pro-rata share of the MMIS.



Comment #13:

Question 15: Does the State have a documented strategic plan to address Medicaid Integrity?


Some States may interpret this to mean the State Medicaid Plan. Also, currently, there is not a specific requirement that States have a strategic plan to address Medicaid Integrity. Is this question intended to identify best practices in State Medicaid Integrity programs?


CMS Response:

CMS has included a definition for “strategic plan” in the draft MIP glossary (see Appendix C). The question is intended to provide CMS with descriptive information on State program integrity practices.




Comment #14:

Question 23: Does the State include language in its MCO contracts specifying Medicaid Integrity requirements?


Certain contract language, which includes provisions for monitoring and oversight of the MCOs, is required; however, not all States will view this as a program integrity function and therefore might not answer this question accurately.


CMS Response:

CMS agrees and will provide guidance in the instructions for the data collection instrument.








Comment #15:

Question 25: Does the State include TPL recoveries as part of its Medicaid Integrity return on investment?


Since some States do include TPL recoveries in their ROI calculations, and some States do not, including this question in the data collection instrument opens the door to inconsistent reporting by the States. It should be one way or the other, and the survey should make that clear. Also, it is difficult to see how TPL activities, which should be a routine part of a claims adjudication system, would fall under the auspices of Program Integrity.


CMS Response:

CMS is not including TPL as part of its definition of Medicaid Integrity. The question is intended to provide CMS with baseline information on whether or not States include TPL in their ROI calculations.



Comment #16:

Question 27: Does the State include prior authorization as part of its Medicaid Integrity return on investment?


If some States include prior authorization in their ROI calculation, and some States do not, including this question in the data collection instrument opens the door to inconsistent reporting by the States. It should be done one way or the other, and the survey should make that clear.


CMS Response:

CMS is not including prior authorization as part of its definition of Medicaid Integrity. The question is intended to provide CMS with baseline information on whether or not States include prior authorization in their ROI calculations.




Comment #17:

Question 29: Does the State have written policies regarding issues including, but not limited to:…How to secure evidence in a legally admissible form?


This choice in Question 29 is puzzling. Does CMS mean that Program Integrity has policies and procedures to ensure that it collects and maintains case review documentation in such a manner that it can support an overpayment determination in the event of an appeal, and that any evidence turned over to the MFCU is valid and could be used by the MFCU during the course of its investigation? Or is CMS asking about data security? Or both? Also, Program Integrity units present case information in administrative hearings, while MFCUs go to court, so who is responsible for ensuring evidence is secured in a legally admissible form?

CMS Response:

The question is intended to determine whether States have written policies and procedures to ensure that Program Integrity staff collect and maintain case review documentation to support their identification of overpayments in the event of an appeal and/or MFCU investigation.




Comment #18

Question 30: Data mining techniques used to detect Medicaid fraud, waste & abuse or inappropriate payments (list & describe)


The meaning of this question is unclear. Does CMS want a listing of applications or software, or the titles of fraud algorithms? “Data mining techniques” is a very broad and general term, and can encompass anything from simple spreadsheet functions, such as sorting, to complex proprietary algorithms requiring programming skills. The value of this question in evaluating State Medicaid Integrity programs is also unclear. It might be more valuable to ask what percent of cases are generated through data mining techniques as opposed to complaints or “tips”.


CMS Response:

The question is intended to provide CMS with descriptive information on State program integrity practices. We have revised the question to include categories of data mining techniques and included additional questions on data repository platforms and data mining analysis tools (See Appendix B, questions 28-30). We have also included a question to determine the percentage of cases opened as a result of data mining activities (See Appendix B, question 31a).




Comment #19:

Question 31: Overpayments ($) identified through data mining techniques


There is no simple way to answer this question. Data mining techniques are used to identify aberrant or overpayment trends among providers and/or claims, and often lead to a desk review or on-site review. Therefore, which action, the data mining or the actual review, “identified” the overpayments? Again, we suggest re-phrasing the question to ask what percent of cases where recoupments were identified were opened as a result of data mining (as opposed to a complaint or a tip).


CMS Response:

CMS worked collaboratively with a number of States in the development of the data collection instrument. Several States indicated they can report this level of information. We have revised the question to include percentage of cases opened as a result of data mining activities (see Appendix B, question 31a) and will re-visit this question based on the responses from the first year of data collection.

Comment #20:

Question #32: Overpayments ($) recovered through data mining techniques:


This question cannot be answered as phrased. Data mining techniques can only help identify potential overpayments; they do not “recover” overpayments. Staff actions subsequent to the data mining recover overpayments.


CMS Response:

CMS agrees and has revised the question to read: “Overpayments ($) recovered as a result of data mining activities”.




Comment #21:

Question 33: Does the State typically extrapolate overpayments?


There appears to be a built-in bias toward extrapolation in this question. Instead, a range of choices should be offered (e.g., frequently, sometimes, depending on the type of review, etc.). In addition, if CMS finds that the State’s program integrity methodology is important for its assessment, then it should ask for information on other techniques used to determine overpayments (e.g., judgmental samples, line-by-line reviews, algorithms, etc.).


CMS Response:

The question is intended to provide CMS with descriptive information on State program integrity practices. We will reassess the level of detail in the question based on the results of the first year data collection.
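For context only, and not as a methodology prescribed by the instrument or by CMS, “extrapolating” overpayments generally means projecting the overpayment rate found in a statistically valid random sample of a provider’s paid claims onto the provider’s full universe of paid claims, for example:

\[
\widehat{O}_{\text{total}} \approx \frac{\text{overpayments found in sample}}{\text{payments in sample}} \times \text{total payments in universe}
\]

The alternative techniques mentioned in the comment, such as judgmental samples and line-by-line reviews, do not involve this projection step; the amount identified is simply the sum of the overpayments actually found.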




Comment #22:

Questions 34 & 36: Overpayments ($) identified and overpayments ($) recovered through…Cost Report audits


Cost report audits should not be confused with or lumped together with program integrity provider reviews. Cost report audits are generally performed by auditors or CPAs in accordance with government auditing standards, for the purpose of validating reported costs that may affect future provider rates or DSH. Differences in costs versus the rates paid are cost settled, either for or against the provider. This is different from the type of overpayments identified by program integrity reviews, which generally stem from non-compliance with policy and/or fraud and abuse.


CMS Response:

The question is intended to provide CMS with baseline information on State program integrity activities. We will reassess the question based on the results of the first year data collection.


Comment #23:

Question 44: Does the State impose provider sanctions…Number of providers that the State suspended payment


Does “suspended payments” also mean withholding payments as provided for under 42 CFR § 455.23?


CMS Response:

CMS defines suspension of payments as “the withholding of payment by a State Medicaid Agency to a provider or supplier before a determination of the amount of the overpayment exists” (See draft MIP glossary in Appendix C).




Comment #24a:

Questions 43-49: Performance Measures


Unless CMS establishes the methodology to be used by States in estimating cost avoidance and return on investment, it cannot ensure that each State’s performance is measured in a similar manner. Cost avoidance should be the measure of a change in a provider’s billing patterns as a result of a Program Integrity review. However, how should this be calculated? Over how long a time period (one year, two years, or more) should cost avoidance be claimed? What is the methodology for measuring cost avoidance of providers who have left or been excluded from the Medicaid programs?...If CMS wishes to collect information on cost avoidance, there must be a common methodology for all States to use.


CMS Response:

CMS has been working collaboratively with a number of States to develop the methodology for the cost avoidance and return on investment performance measures. In the short-term, we plan to determine whether States calculate cost avoidance dollars due to administrative actions against providers, payment system changes, and policy changes. We will continue to work with States to develop a methodology to calculate these measures in the future.
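As a purely illustrative sketch of the kind of standardization the comment requests (an assumption for illustration, not a CMS-prescribed formula), cost avoidance attributable to a change in a provider’s billing pattern could be estimated by comparing billings before and after a Program Integrity intervention over a fixed claiming window:

\[
\text{Cost Avoidance} \approx \left(\bar{B}_{\text{pre}} - \bar{B}_{\text{post}}\right) \times T
\]

where \(\bar{B}_{\text{pre}}\) and \(\bar{B}_{\text{post}}\) are the provider’s average monthly Medicaid billings before and after the review, and \(T\) is the number of months over which avoidance is claimed. The open questions raised in the comment (the length of \(T\), and how to treat providers who have left or been excluded from the program) are exactly the parameters a common methodology would need to fix.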




Comment #24b:

The measure of “percentage of providers with identified overpayments” may also be of limited value. What does this really tell us about the State’s Medicaid integrity program? Will there be a benchmark that CMS wants States to meet?


CMS Response:

CMS agrees and has deleted this question from the SPIA data collection instrument.


Comment #24c:

Certainly States should be able to claim cost avoidance due to system or policy changes, but calculating this can be a complex task, and it covers more areas than Program Integrity. How does one attribute system changes to “Program Integrity” as opposed to some other area of the State Medicaid agency? Indeed, Program Integrity should not be making payment policy. Therefore, system and policy changes, such as new edits to enforce CCI, might be recommended by the Program Integrity division or unit but should only be initiated by policy staff. This performance measure is actually measuring something over which State Program Integrity units have no control or authority.


CMS Response:

See response to Comment #24a.




Comment #24d:

A return on investment may be the single most important performance measure, but again, it will be useless unless CMS standardizes the methodology for determining both the “investment” and the “return”. For example, what should the “return” be measured on: state funds, federal funds, or both?...What kinds of recoveries should be included: provider reviews, paybacks from beneficiaries, third party liability recoveries, estate recoveries, results from quality assessment reviews, penalties resulting from contract compliance audits, hospital admissions denied by the QIO? The list can vary widely from State to State. Calculating return on investment brings us back to the definition of Program Integrity. The more concise and limited this definition is, the more consistent and reliable CMS’ performance measures for the Medicaid State Program Integrity assessments will be.


CMS Response:

CMS agrees and is continuing to assess this issue for future data collections. We will not collect State return on investment information in the first SPIA data collection.
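As a purely illustrative sketch, and not CMS’ adopted methodology (which, per the response above, remains under development), a return-on-investment measure of the kind discussed in this comment is typically a ratio of program integrity returns to program integrity expenditures:

\[
\text{ROI} = \frac{\text{overpayments recovered} + \text{cost avoidance (if counted)}}{\text{Program Integrity expenditures}}
\]

Which recovery types enter the numerator, and whether State funds, Federal funds, or both enter the denominator, are precisely the definitional choices the comment asks CMS to standardize.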







