Medicare Contractor Provider Satisfaction Survey (MCPSS) and Supporting Regulations in 42 CFR 421.120 and 421.122

OMB: 0938-0915

ATTACHMENT 5



ANALYSIS OF DATA FOR DETERMINING A NEW DEFINITION OF A COMPLETED SURVEY





Proposal to define Core Items for the MCPSS


Methodology to Determine Core Items

The determination of core items was restricted to sections A (Provider Inquiries) and C (Claims Processing). Conceptually, sections A and C constitute the core functions of the contractor. Consistent with this idea, the quantitative analysis that examined the contribution of particular sections to overall satisfaction identified A and C as the most important. The other sections were less important with respect to overall satisfaction, as indicated by their low correlation with the overall satisfaction item (pp. of the 2007 Analytic Report).


After restricting attention to the items in sections A and C, a series of stepwise regressions was computed for each of the four contractor types (Carriers, FIs, RHHIs and DMEs). The dependent variable for each regression was the measure of overall satisfaction, and the explanatory variables were the items in sections A and C. In addition, several non-questionnaire items were included as possible explanatory variables: number of facilities, jurisdiction, length of time in Medicare and type of provider. The stepwise regression used a “forward” selection process. At the first step, it selected the variable that explained the most variance in global satisfaction. It then selected the next most powerful predictor of global satisfaction, and it continued this process until no remaining variables were significant at the .001 level.
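To make the selection procedure concrete, the sketch below illustrates a forward stepwise selection of the kind described above. It assumes a hypothetical respondent-level data set with a column for the overall satisfaction item and columns for the section A and C items and the background variables; the column names, and the use of Python with the statsmodels package, are illustrative assumptions rather than a description of the actual analysis programs.

import statsmodels.api as sm

def forward_stepwise(df, outcome, candidates, alpha=0.001):
    """Enter predictors one at a time; at each step keep the candidate that
    most increases the R-Square, and stop when no remaining candidate is
    significant at the alpha (.001) level."""
    selected, path = [], []
    remaining = list(candidates)
    while remaining:
        best = None
        for var in remaining:
            X = sm.add_constant(df[selected + [var]])
            fit = sm.OLS(df[outcome], X, missing="drop").fit()
            if fit.pvalues[var] < alpha and (best is None or fit.rsquared > best[1]):
                best = (var, fit.rsquared)
        if best is None:
            break                          # no remaining variable meets p < .001
        selected.append(best[0])
        remaining.remove(best[0])
        path.append(best)                  # (variable, cumulative R-Square)
    return path

# One regression per contractor type, e.g. for Carriers (hypothetical names):
# path = forward_stepwise(carrier_df, "overall_satisfaction",
#                         section_a_items + section_c_items + background_vars)
# Categorical background variables (e.g., jurisdiction) would need dummy
# coding before being entered this way.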


The significant explanatory variables from these regressions were then used to assess the implications of using them to define a completed interview. This consisted of using the 2007 data to estimate the number of interviews that would no longer be counted as a complete under different definitions.



Stepwise Regressions


Table 1 provides the questionnaire items, in order of importance, along with the contribution each variable makes to the R-Square for the regression. The R-Square represents the cumulative proportion of variance that is explained by the independent variables. For example, when variable A7 alone is used to explain satisfaction among Carrier providers, the R-Square is 44.7%. When both A7 and C1 are entered into the equation, the R-Square increases to 50.5%.


[Table 1. Questionnaire items in order of entry into the stepwise regression and cumulative R-Square, by contractor type]
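As a point of arithmetic, the per-variable contribution shown in Table 1 is simply the increase in the cumulative R-Square when that variable enters the model. A minimal illustration using the two Carrier figures quoted above (the only values reproduced in this text):

r2_a7 = 0.447                        # R-Square with A7 alone
r2_a7_c1 = 0.505                     # R-Square with A7 and C1 together
c1_contribution = r2_a7_c1 - r2_a7   # 0.058, i.e., C1 adds 5.8 percentage points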


The pattern across all of the contractor types is similar. The first few variables entered into the regression explain the most variance. As new variables are added, the increase in the proportion of variance explained becomes very small. For example, for Carriers, the first three variables (A7, C1 and A5) explain 53.7% of the variance. Adding up to 7 other variables adds a total of 2.9% to the total variance explained.


Table 2 provides the ranking of the variables with respect to the proportion of variance explained. Generally speaking, there are slightly more variables from section A than from section C. The most important variables are similar across the different types of contractors. Item A7 (“Your Contractor’s ability to fully resolve problems without you having to make multiple inquiries”) is the first variable selected in the regressions for three of the four contractor types. Item C1 (“The accuracy of your Contractor’s claims editing”) is the second or third variable included across all four types. The other variables are not as consistently selected across all types of contractors. Item A4 (“The effort your Contractor makes to make the Provider Inquiries process as easy as possible for you”) and Item A5 (“The modes of communication that are offered by your Contractor to exchange information with them about Inquiries”) are in the top three for FIs/RHHIs and Carriers/DMEs, respectively. Only one item, A3 (“The consistency of responses that you get from different Provider Inquiries representatives”), appears in the top three for a single contractor type, the RHHIs.


[Table 2. Ranking of section A and C items by proportion of variance explained, by contractor type]


These rankings are slightly different from those discussed in the 10 December 2007 memo. The prior analysis was based on the bivariate correlations between total satisfaction and each of the items, and the cutoff for defining an “important” item was set at a relatively low level (the mid-point). With respect to section A, the prior analysis found items A7 and A4 to be the most important across the different contractors. For section C, items C1 and C5 were ranked as the most important. However, item C5 (“The availability of your Contractor’s representatives to address claims-related issues”) was not found to be important in the stepwise regressions.


For purposes of defining core items, we propose the top three items from the stepwise regressions be included:


DME and Carriers: A7, C1 and A5

FI: A7, A4 and C1

RHHI: A4, C1 and A3


Adding other variables would increase the number of calls that have to be made to get a completed interview and would complicate the interviewer’s task when conducting interviews.


Implications for Data Collection

Using the definition of the core items given above, it is possible to assess the effect a revised definition of “core” would have on the number of completed interviews. Table 3 provides the results from the 2007 survey by the amount of missing data on the core items for each of the contractor types. In total, there were 833 respondents with missing data on at least one of the three core items. This is approximately 4.6% of the total number of completes collected in 2007. Slightly less than half (n=402) of these are missing because the respondent didn’t know or refused to answer; the vast majority of these respondents did not know the answer, and the refusal rate is negligible (less than 1% of this category). A similar number of individuals didn’t answer because the item was not applicable (n=387). Examples of an inapplicable response are that the respondent did not know of any problems during the reference year (see item A7) or did not have experience with the contractor editing claims during the reference period (see item C1).


[Table 3. 2007 respondents with missing data on the core items, by reason and contractor type]
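The sketch below shows how a tally like the one summarized above and in Table 3 could be produced from the 2007 respondent file. The response codes and the rule for classifying a respondent with more than one kind of missing core item (skipped takes precedence, then don’t know/refused, then not applicable) are assumptions made for illustration; they are not taken from the actual processing specifications.

# Core items by contractor type, from the proposal above.
CORE_ITEMS = {"Carrier": ["A7", "C1", "A5"], "DME": ["A7", "C1", "A5"],
              "FI": ["A7", "A4", "C1"], "RHHI": ["A4", "C1", "A3"]}
DONT_KNOW, REFUSED, NOT_APPLICABLE, SKIPPED = "DK", "REF", "NA", "SKIP"

def missing_core_summary(records):
    """Count respondents missing at least one core item, by reason.
    Each record is assumed to be a dict with a contractor_type field
    and one field per questionnaire item."""
    counts = {"skipped": 0, "dont_know_or_refused": 0, "not_applicable": 0}
    for rec in records:
        codes = [rec[item] for item in CORE_ITEMS[rec["contractor_type"]]]
        if SKIPPED in codes:                       # item never administered;
            counts["skipped"] += 1                 # would no longer count as a complete
        elif DONT_KNOW in codes or REFUSED in codes:
            counts["dont_know_or_refused"] += 1
        elif NOT_APPLICABLE in codes:
            counts["not_applicable"] += 1
    return counts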


Our proposal defines a completed interview as one in which all three core items are administered (i.e., not skipped). Under this scenario, forty-four additional interviews would not have been counted as a complete in 2007. We do not recommend requiring non-missing data on all core items because it might jeopardize the use of data from other items of the questionnaire. For example, the vast majority of those missing at least one core item did provide satisfaction data on at least 4 other items in sections A and C.


In conjunction with this definition of the core, we recommend that the interviewing procedures be changed to put more emphasis on collection of the core items. If a respondent answers “don’t know,” refuses, or skips one of these items, a follow-up screen would re-ask the question and emphasize the importance this item has for evaluating the contractor. No such follow-up would be made if the respondent reports that the item is not applicable.
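A minimal sketch of this follow-up rule is given below. The response codes and function names are illustrative assumptions; the substance of the rule, a single re-ask (with a reminder of the item’s importance) after a don’t know, refusal, or skip, and no re-ask when the item is not applicable, follows the recommendation above.

REASK_CODES = {"DK", "REF", "SKIP"}    # don't know, refused, skipped
NOT_APPLICABLE = "NA"

def administer_core_item(ask, item_id):
    """ask(item_id) is assumed to present the item (or the follow-up screen
    on the second call) and return a response code or a substantive answer."""
    response = ask(item_id)
    if response in REASK_CODES:
        # Follow-up screen stressing the importance of this item for
        # evaluating the contractor, then a single re-ask.
        response = ask(item_id)
    return response                    # a not-applicable answer is accepted as given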




