Medicare Part C and Part D Data Validation (42 CFR 422.516(g) and 423.514(g)) (CMS-10305)

OMB: 0938-1115

Appendix 4: Medicare Part C and Part D Reporting Requirements Data Validation

Procedure Manual

Version 7.0

Prepared by:

Centers for Medicare & Medicaid Services

Center for Medicare

Medicare Drug Benefit and C&D Data Group


Last Updated: January 23, 2021




Table of Contents



1. INTRODUCTION

1.1 Data Validation Requirement

1.2 Data Validation Scope

1.3 Types of Organizations Required to Undergo Data Validation

1.4 Requirement to Use This Manual and Tools

1.5 Organization of the Procedure Manual

2. PART C AND PART D REPORTING SECTIONS REQUIRING DATA VALIDATION 2017-2019

2.1 Part C and Part D Reporting Sections Requiring Data Validation in 2017

2.2 Part C and Part D Reporting Sections Requiring Data Validation in 2018

2.3 Part C and Part D Reporting Sections Requiring Data Validation in 2019

2.4 Reporting Requirements Excluded from the Validation Requirement at This Time

3. PLANNING FOR DATA VALIDATION ACTIVITIES

3.1 Select appropriate Reviewer based on Standards for Selecting a Data Validation Contractor

3.1.1 Standards for Selecting a Data Validation Contractor

3.1.2 Timing of Data Validation Contractor Selection

3.1.3 Requesting a Contractor Change Mid-Review

3.2 Notify CMS of DVC Selection / Request Access to Health Plan Management System (HPMS) Plan Reporting Data Validation Module (PRDVM)

3.2.1 Documentation of Data Validation Contractor Selection Process

3.2.2 Request Access to HPMS Plan Reporting Data Validation Module

3.3 Complete the Web-based Data Validation Training

3.4 Review all Data Validation Documents

3.4.1 Introduction to the Data Validation Standards

3.4.2 Data Validation Standards and Reporting Section Criteria

3.4.3 Reporting Section Criteria

4. PERFORMING DATA VALIDATION ACTIVITIES

4.1 Complete Organizational Assessment Instrument (OAI) and Provide Appropriate Documentation to Selected DVC per the OAI’s Documentation Request

4.2 Analyze OAI Responses

4.2.1 Perform OAI Gap Analysis

4.2.2 Review Source Code and Other Documentation

4.2.3 Prepare Interview Discussion Guide

4.3 Prepare for Site Visit

4.3.1 Select Dates and Appropriate Location(s) of Site Visit

4.3.2 Develop Agenda for Site Visit

4.3.3 Prepare for Data Extraction and Sampling

4.4 Conduct Site Visit

4.4.1 Conduct Entrance Conference

4.4.2 Conduct Interviews with Organization Staff

4.4.3 Observe Reporting Processes

4.4.4 Extract Census or Sample Data

4.4.5 Conduct Exit Conference

4.5 Request Additional Documents (If Required)

5. ANALYZING RESULTS AND SUBMISSION OF FINDINGS

5.1 Determine Compliance with Data Validation Standards and Record Findings in Excel Version of the Findings Data Collection Form (FDCF) to Upload into the HPMS PRDVM

5.1.1 Reporting Findings for Standards Using Binary Scale

5.1.2 Reporting Findings for Standards Using Likert Scale

5.1.3 Review of the Findings Data Collection Form

5.1.4 Guidance for Interpreting Standards and Making a Findings Determination

5.2 Provide Draft Findings to Sponsoring Organization

5.3 Review Draft Findings with Sponsoring Organization and Obtain Additional Documentation Necessary to Resolve Issues

5.4 Submit Data Validation Review Findings via HPMS PRDVM

5.4.1 Data Validation Contractor’s Submission of Findings

5.4.2 Sponsoring Organization Disagreement with Findings

6. POST-DATA VALIDATION ACTIVITIES

6.1 Compile Archive of Data Validation Work Papers

6.2 Receive Pass or Not Pass Threshold Level and Assess Pass or Not Pass

6.2.1 Pass/Not Pass Determination

6.2.2 CMS Notification to Sponsoring Organization of Pass/Not Pass Determinations

6.3 Sponsoring Organization Appeal of Data Validation Determination (If Applicable)



List of Exhibits



Exhibit 1. Reporting Requirements Data Validation Procedure Manual Revision History

Exhibit 2. Data Validation Program Phases

Exhibit 3. Data Validation Program Activities

Exhibit 4. Part C and Part D Reporting Sections Requiring Data Validation in 2017

Exhibit 5. Part C and Part D Reporting Sections Requiring Data Validation in 2018

Exhibit 6. Part C and Part D Reporting Sections Requiring Data Validation in 2019

Exhibit 7. Part C and Part D Reporting Sections Excluded from Data Validation

Exhibit 8. General Instructions for Data Validation Standards

Exhibit 9. Example “Note to Reviewer” in Data Validation Standards

Exhibit 10. Standard 1: Required Data Fields Are Accurately Captured and Properly Documented

Exhibit 11. Standard 2: Data Elements Are Accurately Identified, Processed, and Calculated

Exhibit 12. Standard 3: Data Submission

Exhibit 13. Standards 4 and 5: Data System Updates and Archive/Restoration

Exhibit 14. Standards 6 and 7: Data System Changes and Oversight of Delegated Entity Reporting

Exhibit 15. Example Reporting Section Criteria for Appropriate Reporting Period, Reporting Level, and Reporting Deadline

Exhibit 16. Reporting Section Criterion for Defining Key Terms

Exhibit 17. Reporting Section Criteria for Selected Part C Grievances Data Elements

Exhibit 18. Examples of Calculations to Determine Minimum Threshold of Correct Sample/Census Records for “Yes” Finding

Exhibit 19. Example of How to Determine Minimum Threshold of Implemented Policies or Procedures for “Yes” Finding

Exhibit 20. Example of How to Determine Minimum Threshold of Implemented Policies or Procedures for “Yes” Finding

Exhibit 21. Example Rows from FDCF for Standard 1

Exhibit 22. Example of the FDCF for Standard 2, Sub-Standards 2.a through 2.d, for the Part D Grievances Reporting Section

Exhibit 23. Example Rows from FDCF for Standard 2, Sub-Standard 2.e RSC-6 for Part D Grievances Reporting Section*

Exhibit 24. Example Rows from FDCF for Standard 3 for Part D Grievances Reporting Section

Exhibit 25. Example Rows from FDCF for Standards 4 through 7

Exhibit 26. Guidance for Standard 1

Exhibit 27. Guidance for Standard 2

Exhibit 28. Guidance for Standard 3

Exhibit 29. Guidance for Standard 4

Exhibit 30. Guidance for Standard 5

Exhibit 31. Guidance for Standard 6

Exhibit 32. Guidance for Standard 7

Exhibit 33. Minimum Documentation Required for Data Validation Archive




Appendices



Appendix A: Standards for Selecting a Data Validation Contractor

Appendix B: Part C and Part D Reporting Section Data Validation Standards

Appendix C: Model Language for Letter to Confirm Selection of Data Validation Contractor

Appendix D: Example Application for Access to CMS Computer Systems

Appendix E: Organizational Assessment Instrument

Appendix F: Interview Discussion Guide

Appendix G: Example Site Visit Agenda

Appendix H: Data Extraction and Sampling Instructions

Appendix I: Example Data File Inventory Log

Appendix J: Findings Data Collection Form

Appendix K: Data Validation Pass/Not Pass Determination Methodology

Appendix L: Toolkit of Universes for Sponsor Data Validation. The reviewer should use this toolkit when validating plan data; it identifies which data elements to extract from SO data in order to validate the data submitted in HPMS for each reporting section.

Appendix M: Acronyms

Exhibit 1. Reporting Requirements Data Validation Procedure Manual Revision History

Version Number | Date | Description of Change
1.0 | December 2010 | Baseline Release for March-May 2011 data validation
2.0 | December 2011 | Updated Release for April-June 2012 data validation
3.0 | January 2013 | Updated Release for April-June 2013 data validation
4.0 | March 2013 | Updated Release for April-June 2014 data validation
5.0 | February 2015 | Updated Release for April-June 2015 data validation
6.0 | December 2015 | Updated Release for April-June 2016 data validation
7.0 | | Updated Release for April-June 2017 data validation

1. INTRODUCTION

1.1 Data Validation Requirement

The Centers for Medicare & Medicaid Services (CMS) requires that organizations contracted to offer Medicare Part C and/or Part D benefits (sponsoring organizations, or SOs) undergo an independent yearly review to validate the data they report to CMS per the Medicare Part C and Part D Reporting Requirements Technical Specifications (Technical Specifications).1 The purpose of the independent data validation (DV) is to ensure that Part C and Part D SOs report health and drug plan data that are reliable, valid, complete, comparable, and timely.

The validated data improve reporting and provide CMS with assurance that data are credible and consistently collected and reported by Part C and Part D SOs. CMS uses these reported data to respond to inquiries from Congress, oversight agencies, and the public about an SO’s performance, using indicators such as operations, costs, availability and use of services, provider network adequacy, and grievance rates. The validated data also allow CMS to more effectively monitor and compare the performance of SOs over time. These data may be used for Star Ratings and other performance measures. Additionally, SOs can take advantage of the DV process to more effectively assess their own performance and make improvements to their internal data, systems, and reporting processes.

The primary purpose of this Procedure Manual (Manual) is to provide SOs and the data validation contractors (DVCs) (reviewers) they select to perform the DV with information regarding the Part C and Part D Reporting Requirements Data Validation program. The Manual provides background information and an overview of the DV program, discusses the scope and timeframe required for the DV, and describes the tools and processes used for conducting the DV.

All revisions to the reporting section criteria since the April – June 2016 DV cycle are identified by underlined and/or strikethrough text.


1.2 Data Validation Scope

CMS requires that the annual, retrospective DV be conducted once per year. For the 2017 DV cycle, the DV will take place during the April 1, 2017 – June 30, 2017 timeframe and will incorporate all data submitted to CMS by March 31 based on the previous calendar year’s reporting requirements. Any data submitted or re-submitted by an SO after March 31 cannot be used for purposes of the DV. The reviewer must submit findings from the annual DV review to CMS by June 30, 2017.

The DV reviews will continue to be conducted at the contract level. CMS believes the contract is the most appropriate unit of analysis in conducting this DV, given that the Part C/D data are generally available at the contract level and that the contract is the basis of any legal and accountability issues concerning the rendering of services.


1.3 Types of Organizations Required to Undergo Data Validation


All Part C and Part D SOs that report Part C and/or Part D data to CMS per the Technical Specifications, regardless of enrollment size, are required to undergo an annual DV review.

The only SOs exempt from participating in the data validation program are:

  • Program of All-Inclusive Care for the Elderly (PACE) SOs and Part C Health Care Prepayment Plans.

  • An SO that terminates its contract(s) to offer Medicare Part C and/or Part D benefits, or that is subject to a CMS termination of its contract(s), is not required to undergo a DV review for the final contract year’s reported data. Similarly, for reporting sections that are reported at the plan benefit package (PBP) level, PBPs that terminate are not required to undergo a DV review for the final year’s reported data.

Any SO that delegates the data collection, calculation, and/or reporting for any reporting section or data element to a Pharmacy Benefit Manager (PBM) or other type of delegated entity must have the reviewer it hires include the data and reporting processes for which the PBM/delegated entity is responsible in its DV review for each applicable contract. For example, all entities are required to provide applicable policies, procedures, and source data to the reviewer for validation if they submit data to an SO that is used for any reporting section.


1.4 Requirement to Use This Manual and Tools


CMS requires that SOs and their selected DV reviewers use the processes and tools contained in this Manual and its appendices to conduct the annual DV. This includes each of the following documents:

  1. Standards for Selecting a Data Validation Contractor (Appendix A)

  2. Data Validation Standards (Appendix B)

  3. Model Language for Letter to Confirm Selection of Data Validation Reviewer (Appendix C)

  4. Example Application for Access to CMS Computer Systems (Appendix D)

  5. Organizational Assessment Instrument (OAI) (Appendix E)

  6. Interview Discussion Guide (IDG) (Appendix F)

  7. Example Site Visit Agenda (Appendix G)

  8. Data Extraction and Sampling Instructions (Appendix H)

  9. Example Data File Inventory Log (Appendix I)

  10. Findings Data Collection Form (FDCF) (Appendix J)

  11. Pass/Not Pass Determination Methodology (Appendix K)

  12. Toolkit of Universes for Sponsor Data Validation (Appendix L). The reviewer should use this toolkit when validating plan data; it identifies which data elements to extract from SO data in order to validate the data submitted in HPMS for each reporting section.

  13. Acronyms (Appendix M)

The Data Validation Standards (Standards) and other documentation associated with the implementation of the DV program assess an SO’s information systems capabilities and overall processes for collecting, storing, compiling, and reporting the required Part C and Part D reporting sections. CMS expects to establish consistency in the DV program by requiring that all entities use appropriate tools and follow the same process.

In order to ensure that the DV documentation can incorporate periodic clarifications to the Part C and Part D Reporting Requirements Technical Specifications, CMS intends to update this Manual and the DV tools contained in its appendices annually, no later than February 28 of each year. CMS will post the most current version publicly on the CMS website (Medicare Part C and D Data Validation) and within the Health Plan Management System (HPMS) Plan Reporting Data Validation Module (PRDVM). Prior to beginning each annual DV, it is the responsibility of all SOs and DV reviewers to confirm that they are using the most recent DV documentation available on the CMS DV website.

In the event of a conflict between the Technical Specifications and the Data Validation Standards reporting section criteria, the Data Validation Standards supersede the Technical Specifications. Reviewers must use the Data Validation Standards reporting section criteria to determine DV findings. CMS will take a conflict between the Technical Specifications and the Data Validation Standards into consideration when evaluating the results of the DV review.


1.5 Organization of the Procedure Manual


Exhibit 2 below illustrates how the Manual is organized. The document’s content is structured according to the four phases that comprise the DV process. The graphic presents the phases in the order in which the annual DV cycle is conducted.


Exhibit 2. Data Validation Program Phases


Planning for DV Activities => Performing DV Activities => Analyzing Results and Submission of Findings => Completing Post-DV Activities


Each phase of the DV review process contains several activities. Exhibit 3 displays the activities in the order in which they are found in the document and the order in which they are conducted, beginning with the selection of an appropriate reviewer and ending with the appeal of DV determinations. The DV review process largely entails a collaborative effort between the SO and its independent, external reviewer in terms of information sharing up to the point of the reviewer’s final submission of DV review findings to CMS. Each of these steps is described in more detail throughout the Manual.


Exhibit 3. Data Validation Program Activities


Step | Responsible Party | DV Activities | Timeline*

Planning for DV Activities:

1 | SO | Select appropriate DVC based on Standards for Selecting a Data Validation Contractor | December*-March
2 | DVC, SO | Notify CMS of DVC Selection / Request Access to Health Plan Management System (HPMS) Plan Reporting Data Validation Module (PRDVM) | January-April
3 | DVC, SO | Complete the web-based Data Validation Training | February-March
4 | DVC, SO | Review all DV documents | January-March

Performing DV Activities:

5 | SO | Complete Organizational Assessment Instrument (OAI) and provide appropriate documentation to selected reviewer per the OAI’s documentation request | March 1 - April 1
6 | DVC, SO | Analyze OAI responses | March 1 or later
7 | DVC, SO | Prepare for site visit (site visit agenda, resource needs, and logistics) | Early April
8 | DVC, SO | Conduct on-site review (convene entrance conference, conduct interviews with SO staff, observe SO’s reporting processes, and obtain census and/or sample files) | Early April (allow for up to 1 week)
9 | DVC | Request additional documents following site visit (if applicable) | Mid/Late April

Analyzing Results and Submission of Findings:

10 | DVC | Determine compliance with Data Validation Standards and record findings in Excel version of the Findings Data Collection Form (FDCF) | June
11 | DVC | Provide draft findings to SO | June
12 | DVC, SO | Review draft findings and obtain additional documentation necessary to resolve issues | June
13 | DVC | Submit findings to CMS via HPMS PRDVM and receive DV scores | No later than June 30

Completing Post-DV Activities:

15 | DVC, SO | Compile archive of DV work papers | July 31
16 | SO | Receive Pass or Not Pass threshold level and assess Pass or Not Pass determination based on final DV scores | Summer/Fall
17 | SO | Appeal DV determination(s) (if applicable) | Within 5 days of receiving threshold level from CMS

* References to December refer to the calendar year before the DV review; all other references to months refer to the same calendar year as the DV review.

2. PART C AND PART D REPORTING SECTIONS REQUIRING DATA VALIDATION 2017-2019



This section provides an overview of the Part C and Part D reporting sections that will undergo validation over the next three years. The DV reporting section criteria that are included in the DV Standards are mapped specifically to these reporting sections.


2.1 Part C and Part D Reporting Sections Requiring Data Validation in 2017



CMS modified the reporting deadlines for the calendar year 2016 Part C and Part D Reporting Requirements Technical Specifications so that all affected reporting sections are reported to CMS prior to the start of the DV timeframe, and all validations of calendar year 2016 data will be completed in 2017. This modification was completed in 2016. The 2017 DV includes the 8 Part C and Part D reporting sections listed in Exhibit 4.



Exhibit 4. Part C and Part D Reporting Sections Requiring Data Validation in 2017



2016 Reporting Section | Reporting Period(s) | Data Submission Due Date(s) to CMS | DV Findings Due to CMS*

Part C
Grievances | 1/1/16 - 3/31/16; 4/1/16 - 6/30/16; 7/1/16 - 9/30/16; 10/1/16 - 12/31/16 | 2/6/17 | 6/30/17
Organization Determinations/Reconsiderations | 1/1/16 - 3/31/16; 4/1/16 - 6/30/16; 7/1/16 - 9/30/16; 10/1/16 - 12/31/16 | 2/27/17 | 6/30/17
Sponsor Oversight of Agents - C | 1/1/16 - 12/31/16 | 2/6/17 | 6/30/17
Special Needs Plans (SNPs) Care Management | 1/1/16 - 12/31/16 | 2/27/17 | 6/30/17

Part D
Medication Therapy Management Programs | 1/1/16 - 12/31/16 | 2/6/17 | 6/30/17
Grievances | 1/1/16 - 3/31/16; 4/1/16 - 6/30/16; 7/1/16 - 9/30/16; 10/1/16 - 12/31/16 | 2/6/17 | 6/30/17
Coverage Determinations and Redeterminations | 1/1/16 - 3/31/16; 4/1/16 - 6/30/16; 7/1/16 - 9/30/16; 10/1/16 - 12/31/16 | 2/27/17 | 6/30/17
Sponsor Oversight of Agents - D | 1/1/16 - 12/31/16 | 2/6/17 | 6/30/17


2.2 Part C and Part D Reporting Sections Requiring Data Validation in 2018



The 2018 DV includes the 8 Part C and Part D reporting sections listed in Exhibit 5. Note that all 8 reporting sections have a reporting deadline of 2/26/18 or earlier.


Exhibit 5. Part C and Part D Reporting Sections Requiring Data Validation in 2018


2017 Reporting Section | Reporting Period(s) | Data Submission Due Date(s) to CMS | DV Findings Due to CMS*

Part C
Grievances | 1/1/17 - 3/31/17; 4/1/17 - 6/30/17; 7/1/17 - 9/30/17; 10/1/17 - 12/31/17 | First Monday of February in 2018 | 6/30/18
Organization Determinations/Reconsiderations | 1/1/17 - 3/31/17; 4/1/17 - 6/30/17; 7/1/17 - 9/30/17; 10/1/17 - 12/31/17 | Last Monday of February in 2018 | 6/30/18
Sponsor Oversight of Agents - C | 1/1/17 - 12/31/17 | First Monday of February in 2018 | 6/30/18
Special Needs Plans (SNPs) Care Management | 1/1/17 - 12/31/17 | Last Monday of February in 2018 | 6/30/18

Part D
Medication Therapy Management Programs | 1/1/17 - 12/31/17 | First Monday of February in 2018 | 6/30/18
Grievances | 1/1/17 - 3/31/17; 4/1/17 - 6/30/17; 7/1/17 - 9/30/17; 10/1/17 - 12/31/17 | First Monday of February in 2018 | 6/30/18
Coverage Determinations and Redeterminations | 1/1/17 - 3/31/17; 4/1/17 - 6/30/17; 7/1/17 - 9/30/17; 10/1/17 - 12/31/17 | Last Monday of February in 2018 | 6/30/18
Sponsor Oversight of Agents - D | 1/1/17 - 12/31/17 | First Monday of February in 2018 | 6/30/18


2.3 Part C and Part D Reporting Sections Requiring Data Validation in 2019


The 2019 DV includes the 8 Part C and Part D reporting sections listed in Exhibit 6. Note that all 8 reporting sections have a reporting deadline of 2/25/19 or earlier.

Exhibit 6. Part C and Part D Reporting Sections Requiring Data Validation in 2019

2018 Reporting Section | Reporting Period(s) | Data Submission Due Date(s) to CMS | DV Findings Due to CMS*

Part C
Grievances | 1/1/18 - 3/31/18; 4/1/18 - 6/30/18; 7/1/18 - 9/30/18; 10/1/18 - 12/31/18 | First Monday of February in 2019 | 6/30/19
Organization Determinations/Reconsiderations | 1/1/18 - 3/31/18; 4/1/18 - 6/30/18; 7/1/18 - 9/30/18; 10/1/18 - 12/31/18 | Last Monday of February in 2019 | 6/30/19
Sponsor Oversight of Agents - C | 1/1/18 - 12/31/18 | First Monday of February in 2019 | 6/30/19
Special Needs Plans (SNPs) Care Management | 1/1/18 - 12/31/18 | Last Monday of February in 2019 | 6/30/19

Part D
Medication Therapy Management Programs | 1/1/18 - 12/31/18 | First Monday of February in 2019 | 6/30/19
Grievances | 1/1/18 - 3/31/18; 4/1/18 - 6/30/18; 7/1/18 - 9/30/18; 10/1/18 - 12/31/18 | First Monday of February in 2019 | 6/30/19
Coverage Determinations and Redeterminations | 1/1/18 - 3/31/18; 4/1/18 - 6/30/18; 7/1/18 - 9/30/18; 10/1/18 - 12/31/18 | Last Monday of February in 2019 | 6/30/19
Sponsor Oversight of Agents - D | 1/1/18 - 12/31/18 | First Monday of February in 2019 | 6/30/19


2.4 Reporting Requirements Excluded from the Validation Requirement at This Time



Several Part C and Part D reporting sections included in the Technical Specifications will not undergo validation in 2017 because they have either been suspended from reporting or will be used for monitoring purposes only. Exhibit 7 lists the reporting sections that are required for reporting but are excluded from the DV review at this time.



Exhibit 7. Part C and Part D Reporting Sections Excluded From Data Validation

Part C Reporting Sections:

  • Enrollment/Disenrollment

  • Employer Group Plan Sponsors

  • PFFS Plan Enrollment Verification Calls

  • PFFS Provider Payment Dispute Resolution Process

  • Rewards and Incentives Programs

  • Mid-Year Network Changes

  • Payments to Providers

Part D Reporting Sections:

  • Enrollment/Disenrollment

  • Retail, Home Infusion, and Long-Term Care Pharmacy Access

  • Employer/Union-Sponsored Group Health Plan Sponsors

  • Improving drug utilization


3. PLANNING FOR DATA VALIDATION ACTIVITIES

3.1 Select appropriate Reviewer based on Standards for Selecting a Data Validation Contractor



CMS requires that the DV be conducted by an independent, external entity. CMS believes this requirement ensures that the data used to develop plan performance measures are credible to other stakeholders and that the information used to respond to Congressional and public inquiries is reliable for monitoring plans. The SO is responsible for acquiring the independent Data Validation Contractor (DVC) and for all other costs associated with completing the independent DV and reporting the results to CMS.



3.1.1 Standards for Selecting a Data Validation Contractor


CMS has provided a set of Standards for Selecting a DVC (Appendix A) as guidance for SOs to use in acquiring a contractor. These standards describe the minimum qualifications, credentials, and resources that the selected DVC must possess, as well as the conduct that the DVC must exhibit. SOs must acquire a single Data Validation Contractor to conduct the validation of reported data; if necessary, the DVC may subcontract to ensure that it has the expertise required for each DV area and meets the minimum standards. SOs may use their own staff only to assist the DVC in obtaining the information, data, and documents needed to complete the DV review.

SOs may also engage a different DVC to perform mock audits, pre-assessments, and other types of review throughout the year. However, in order to meet CMS’ standards for organizational independence, an SO may not use the DVC that conducted these activities to conduct the subsequent DV review of those reported data. More detailed information pertaining to organizational independence is included in Appendix A, Standards for Selecting a Data Validation Contractor. While the DVC conducting the formal DV review may not participate in mock audits, pre-assessments, or other types of reviews, the reviewer can begin preparing for the DV review prior to April 1 so that the validation review can begin as soon as possible on April 1. These preparation activities may include:

    • Meeting with the SO to discuss the validation process, resource needs, timeline, etc.

    • Providing the SO with a list of documents, data, and materials that are needed to complete the review.

Any specific questions about what types of activities are permitted prior to April 1 or regarding whether or not a particular entity meets the organizational independence standard should be directed to [email protected].

The Standards for Selecting a Data Validation Contractor also contain best practices that reviewers are expected to adhere to throughout the course of the review. The reviewer should remain an objective, independent third party and avoid acting in a consulting capacity. The reviewer should remain impartial in all of its activities and focused on determining if SOs’ systems, programs, data, etc. are accurate, reliable, valid, and complete based on instructions and standards outlined in Appendix A and CMS’ policies. The reviewer should provide general feedback and specific information on deficiencies to help SOs improve, and should maintain confidentiality of SOs’ privileged information.



3.1.2 Timing of Data Validation Contractor Selection

An SO may select a DVC reviewer at any time, up to and during the April through June DV review period. SOs should implement the contract to allow sufficient time for the reviewer to perform all of the requirements of the review during the required timeframe and submit findings to CMS via the PRDVM in HPMS by June 30.



3.1.3 Requesting a Contractor Change Mid-Review

An SO may not change its DVC reviewer during the formal review period (April-June) unless there are conditions that are unrelated to DV findings such as negligence or malfeasance on the part of the reviewer. If a change in contractor is required, the new reviewer is required to complete the DV review in its entirety (starting with the OAI analysis through the submission of findings to CMS) within the required April - June DV review timeline.

CMS will consider DVC reviewer change requests submitted mid-review on a case-by-case basis only. Requests must be in writing and be submitted to CMS via the [email protected] email box.

3.2 Notify CMS of DVC Selection / Request Access to Health Plan Management System (HPMS) Plan Reporting Data Validation Module (PRDVM)

3.2.1 Documentation of Data Validation Contractor Selection Process

SOs must document their DVC selection process and be able to show, upon request by CMS, how their chosen reviewer meets the minimum qualifications, credentials, and resources set forth in the Standards for Selecting a Data Validation Contractor. This includes maintaining a copy of the documentation showing that all DVC staff assigned to the applicable DV review team completed the CMS Data Validation Training program (see Section 3.3). CMS requires that the SO retain this documentation for the 10-year retention period per federal regulations.2

If an SO chooses to select the same DV reviewer it used for a previous year’s DV review, it must still document the selection process as described above.


3.2.2 Request Access to HPMS Plan Reporting Data Validation Module

Once the SO has selected a DVC, the next step is for the reviewer to request access to the PRDVM in HPMS. This module allows users to upload and review DV findings and submit them to CMS. The credentials assigned to a user will allow that individual to access only the PRDVM and those SO(s)/contract(s) with which they are associated. The reviewer will use these credentials to access the appropriate screen(s) to upload DV findings within the PRDVM starting no earlier than April 1 of the calendar year.


3.2.2.1 Process for Sponsoring Organization

Each SO is required to provide its reviewer with an official letter, in either hardcopy or emailed PDF format. This letter must contain the following in order for individuals representing the reviewer to gain access to the PRDVM:


  • The SO’s acknowledgment that it has contracted with the selected reviewer to complete the review,

  • The name of each individual who requires access (up to 5 individuals),

  • The type of functionality that each individual user requires,

  • Acknowledgement that the individuals have completed the web-based DV Training,

  • The contract number(s) the reviewer will need access to, and

  • The SO’s Chief Executive Officer’s (CEO) signature.


Model language for this letter can be found in the Model Language for Letter to Confirm Selection of Data Validation Contractor (Appendix C).

If an SO chooses to select the same DV reviewer it used for a previous year’s DV review, it must still provide the reviewer with this signed letter for the current year’s DV activities.


3.2.2.2 Process for Data Validation Reviewer

DV reviewers must obtain individual access to the HPMS PRDVM. If the designated user(s) from the reviewer does not have active access to HPMS, each user should download the Application for Access to CMS Computer Systems at: Access to CMS Data & Application and follow the instructions provided in the Example Application for Access to CMS Computer Systems (Appendix D) for requesting reviewer access to the HPMS PRDVM. CMS will allow up to 5 individuals from each reviewer to have access to this module on behalf of each SO. One application must be completed for each user. The reviewer must send the completed application(s) to CMS, along with the letter (signed by the CEO) from each SO for which it is under contract to complete the DV review. These documents may be sent as email attachments to daniel.summers@cms.hhs.gov or via traceable carrier to:


Daniel Summers

Re: Plan Reporting Data Validation Reviewer HPMS Access

7500 Security Blvd.

Location: C4-17-24 / Mailstop: C4-18-13

Baltimore, MD 21244-1850


If a DV reviewer is serving multiple SOs, only one CMS user access form is required for each of that contractor’s PRDVM users; however, a letter must be provided from each SO for which the individual reviewer will serve as an agent in HPMS.

The process for gaining access to the PRDVM in HPMS can begin with the submission of only one application and the letter from the SO. The DVC can submit new applications as they are obtained, along with a copy of the SO letter, until they have reached the limit of 5 individuals.

For individuals that already have active CMS Enterprise User Administration (EUA) User IDs and HPMS access, a new Application for Access to CMS Computer Systems is not necessary. Instead, their current credentials must be modified to allow access to the PRDVM. For this access, individuals need to ensure that the letter from each SO linking the DVC to the SO (signed by the CEO) includes the individual’s current User ID and an explanation that the user already has HPMS Access. This letter must be sent to CMS via email or traceable carrier to the address indicated above.

The findings from the annual DV review must be submitted to CMS by June 30 of each calendar year. To assure timely access to the HPMS PRDVM to meet this annual DV timeframe, CMS strongly recommends requests for HPMS PRDVM access be submitted by April 6. Any requests received after this date will be processed on a rolling basis. It will take approximately four weeks for the designated individuals to obtain the credentials (CMS EUA User IDs and passwords) to access the PRDVM.

For DVC staff that participated in a previous year’s DV and already have an active CMS Enterprise User Administration (EUA) User ID and HPMS access, a new Application for Access to CMS Computer Systems is not necessary. However, these individuals must still follow the process described above to provide CMS with the letter from the SO linking the DVC to the SO in order to obtain access to the HPMS PRDVM for the current year’s DV activities.


3.3 Complete the Web-based Data Validation Training

CMS has developed a web-based DV Training that provides an opportunity for SOs and DV reviewers to learn more about the DV program and its specific requirements. The training is offered through the CMS Medicare Learning Network and can be found at: http://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNProducts/WebBasedTraining.html.

During the DV preparation phase, all SO staff involved in the DV should complete the CMS DV Training individually to familiarize themselves with the DV process and requirements.

Additionally, all reviewer staff assigned to a DV review team are required to take the CMS DV Training prior to working on the DV project. Once the training is completed, a certificate of completion is generated. The reviewer must provide this documentation to any hiring SO for all staff assigned to the applicable DV review team before commencing work on the DV.

CMS plans to offer continuing education credits resulting from successful completion of the DV Training. The continuing education certificate will be automatically generated upon successful completion of the course along with the certificate of completion.

Any DV reviewer staff that participated in a previous year’s DV must still take the current year’s CMS DV Training prior to working on the DV project and must provide documentation to the hiring SO that the current year’s training was completed before commencing work on the DV.

3.4 Review all Data Validation Documents

As noted in Section 1.4, this Manual and the documents contained in its appendices should be reviewed well in advance of the DV period. This Manual describes those materials. This section focuses specifically on the DV Standards, which are further described in the Data Validation Standards (Appendix B).


3.4.1 Introduction to the Data Validation Standards

The DV Standards include general standards and reporting section criteria that DV reviewers must use to determine whether the data each SO reported to CMS per the Part C/Part D Reporting Requirements Technical Specifications are accurate, valid, timely, and reliable.

The standards assess an SO’s information systems capabilities and its processes for collecting, storing, compiling, and reporting Part C and/or Part D data. They also assess whether SOs follow the applicable Technical Specifications to compile data, take into account appropriate data exclusions, and verify calculations, computer code, and algorithms.

In preparation for the DV process, both the SO and the reviewer must review and learn the standards. Refer to Appendix B for the complete set of Part C and Part D Reporting Section Data Validation Standards along with guidance related to interpreting the standards.



3.4.2 Data Validation Standards and Reporting Section Criteria

3.4.2.1 Data Validation Standards Instructions


The DV Standards include identical instructions relating to the types of information that must be reviewed for each reporting section, a set of validation standards (also identical for each reporting section), and reporting section criteria that are based on the applicable Technical Specifications.

The DV reviewer must use these standards in conjunction with the Data Extraction and Sampling Instructions (Appendix H) and the Excel version of the FDCF (Appendix J) to evaluate the SO’s processes for producing and reporting the reporting sections. CMS strongly recommends that the reviewer, the SO’s leadership team, and the reporting section report owners/data providers review the DV Standards documentation before and during the review of each reporting section to ensure that they thoroughly understand the standards and reporting section criteria. This will also help to ensure all applicable data fields are extracted for each reporting section.

The top portion of each set of standards (which is identical for each reporting section) details the documents and reports that the reviewer is required to use to determine compliance with the standards for each specific reporting section. The documents and reports are listed within the gray box underneath the name of the applicable reporting section and are displayed in Exhibit 8.


Exhibit 8. General Instructions for Data Validation Standards

[NAME OF REPORTING SECTION]

To determine compliance with the standards for [name of reporting section], the reviewer will assess the following information:

  • Written response to OAI Sections 3 and 4, and documentation requested per OAI Sections 5 and 6

  • Results of interviews with organization staff

  • Census and/or sample data

  • Data file created for submission to CMS and copy of HPMS screen shots of data entered

  • Other relevant information provided by organization


Also contained within this section, if applicable, are notes to the reviewer regarding a specific reporting section and any nuances or differences that may be encountered during the review of that reporting section. See Exhibit 9 for an example “Note to reviewer” for the Part C Grievances reporting section.

Exhibit 9. Example “Note to Reviewer” in Data Validation Standards

GRIEVANCES (PART C)

(for 2016 REPORTED DATA)

Note to reviewer: Aggregate all quarterly data before applying the 90% threshold.

Note to reviewer: Apply the 90% threshold to the total count of grievances calculated. Do not apply the 90% threshold to individual grievance categories.
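
To make the aggregation rule concrete, the following is a minimal sketch with invented figures. The Data Validation Standards (Appendix B) and Appendix K govern the actual threshold methodology; the comparison shown here (reported total versus the reviewer's recalculated total) is an illustrative assumption, not the prescribed formula.

```python
# Hypothetical illustration of the notes above: apply the 90% threshold to the
# aggregated annual total, not to individual quarters. All figures are invented.

reported_by_quarter = [112, 98, 130, 105]  # quarterly counts the SO reported to CMS
reviewer_recalculated_total = 412          # annual total recalculated by the reviewer

reported_total = sum(reported_by_quarter)  # aggregate first: 445

# One plausible comparison (assumption for illustration only): how closely the
# reported total matches the reviewer's recalculated total.
match_rate = min(reported_total, reviewer_recalculated_total) / max(
    reported_total, reviewer_recalculated_total
)
print(f"Aggregated match rate: {match_rate:.1%}")  # 92.6%
print("Meets 90% threshold" if match_rate >= 0.90 else "Below 90% threshold")
```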


The second section of each set of standards is identical for all Part C and Part D reporting sections.

3.4.2.2 Data Validation Standards 1 - 7

3.4.2.2.1 Standard 1

Standard 1 (see Exhibit 10) contains the general and specific criteria for validating source documentation that the SO provides to the reviewer.


Exhibit 10. Standard 1: Required Data Fields Are Accurately Captured and Properly Documented

DATA VALIDATION STANDARD 1

1. A review of source documents (e.g., programming code, spreadsheet formulas, analysis plans, saved data queries, file layouts, process flows) indicates that all source documents accurately capture required data fields and are properly documented.

Criteria for Validating Source Documents:

  1. Source documents are properly secured so that source documents can be retrieved at any time to validate the information submitted to CMS via CMS systems.

  2. Source documents create all required data fields for reporting requirements.

  3. Source documents are error-free (e.g., programming code and spreadsheet formulas have no messages or warnings indicating errors, use correct fields, have appropriate data selection, etc.).

  4. All data fields have meaningful, consistent labels (e.g., label field for patient ID as Patient ID, rather than Field1 and maintain the same field name across data sets).

  5. Data file locations are referenced correctly.

  6. If used, macros are properly documented.

  7. Source documents are clearly and adequately documented.

  8. Titles and footnotes on reports and tables are accurate.

  9. Version control of source documents is appropriately applied.
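
For illustration only, the short script below shows what several of these criteria can look like in practice: an in-source version history (criterion 9), an explicitly referenced data file location (criterion 5), and meaningful, consistent field labels (criterion 4). This is a generic sketch with invented file paths and field names, not a CMS-prescribed format.

```python
# Generic sketch of source documentation qualities named in Standard 1.
# File path and field names are invented for illustration.
#
# Version history (criterion 9):
#   v1.1  2017-01-15  Excluded CTM-only complaints from the grievance universe
#   v1.0  2016-12-01  Initial release
import csv

# Data file location referenced explicitly (criterion 5).
GRIEVANCE_FILE = "/data/partc/grievances_2016.csv"

def load_grievances(path: str = GRIEVANCE_FILE) -> list[dict]:
    """Load grievance records with meaningful, consistent labels (criterion 4)."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        # Descriptive names (e.g., member_id rather than Field1), kept identical
        # across data sets so reviewers can trace fields end to end.
        return [
            {
                "member_id": row["member_id"],
                "grievance_category": row["grievance_category"],
                "decision_notified_date": row["decision_notified_date"],
            }
            for row in reader
        ]
```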

3.4.2.2.2 Standard 2

Standard 2 (see Exhibit 11) instructs the reviewer to validate the completeness of the underlying data and the accuracy of each reported reporting section. Standard 2 provides an overview of the reporting section criteria, which must be met for each Part C and Part D reporting section being reported and which are further detailed in Section 3.4.3. For example, the reporting section criteria assess whether the appropriate date ranges for the reporting period are captured by the data system, and whether the expected counts and calculations are accurate and match the corresponding source code and analysis plan. The criteria are also used to verify that the SO has properly interpreted and defined key terms used to determine which data are applicable. For example, the SO must properly define the terms “Coverage Determinations and Redeterminations” in accordance with CMS regulations, guidance, and the Technical Specifications in order to ensure the quality of the reported data for that reporting section. Sub-Standard 2.e is further broken down into additional criteria that map to the relevant Technical Specifications data elements.

Exhibit 11. Standard 2: Data Elements Are Accurately Identified, Processed, and Calculated


DATA VALIDATION STANDARD 2

2. A review of source documents (e.g., programming code, spreadsheet formulas, analysis plans, saved data queries, file layouts, process flows) and census or sample data, whichever is applicable, indicates that data elements for each reporting section are accurately identified, processed, and calculated.

Criteria for Validating Reporting Section Criteria (Refer to reporting section criteria section below):

  a. The appropriate date range(s) for the reporting period(s) is captured.

  b. Data are assigned at the applicable level (e.g., plan benefit package or contract level).

  c. Appropriate deadlines are met for reporting data.

  d. Terms used are properly defined per CMS regulations, guidance, and Reporting Requirements Technical Specifications.

  e. The number of expected counts (e.g., number of members, claims, grievances, procedures) is verified; ranges of data fields are verified; all calculations (e.g., derived data fields) are verified; missing data have been properly addressed; reporting output matches corresponding source documents (e.g., programming code, saved queries, analysis plans); version control of reported data elements is appropriately applied; QA checks/thresholds are applied to detect outlier or erroneous data prior to data submission.
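
As a hedged illustration of the kinds of checks Sub-Standard 2.e describes, the sketch below recomputes an expected count from toy source records and verifies that a data field falls within the reporting period. Field names and values are invented; actual checks depend on the reporting section and the SO's systems.

```python
# Generic sketch of Sub-Standard 2.e-style checks (field names and data invented).
from datetime import date

# Toy records standing in for data extracted from the SO's source system.
records = [
    {"grievance_id": "G-001", "decision_notified_date": date(2016, 3, 14)},
    {"grievance_id": "G-002", "decision_notified_date": date(2016, 11, 2)},
]
reported_total = 2  # value the SO reported to CMS (invented)

# Expected counts: recompute the total from source data and compare to the report.
assert len(records) == reported_total, "reported count does not match source data"

# Ranges: every decision date must fall within the reporting period.
period_start, period_end = date(2016, 1, 1), date(2016, 12, 31)
for rec in records:
    assert period_start <= rec["decision_notified_date"] <= period_end
print("Count and range checks passed")
```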


3.4.2.2.3 Standard 3

Standard 3 (see Exhibit 12) is used to determine whether an SO implements policies and procedures for each reporting section’s data submission. Not only should the reviewer validate that the reported data were correctly derived from the underlying database, but the reviewer should also verify that the data are accurately uploaded and/or entered into CMS systems. If a reporting section requires both a file upload and data entry, both must occur in order for an SO to meet Sub-Standard 3.a.


Exhibit 12. Standard 3: Data Submission

DATA VALIDATION STANDARD 3

3. Organization implements policies and procedures for data submission, including the following:

  a. Data elements are accurately entered/uploaded into CMS systems and entries match corresponding source documents.

  b. All source, intermediate, and final stage data sets and other outputs relied upon to enter data into CMS systems are archived.
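
As an illustration of Sub-Standard 3.a, a reviewer might compare each data element entered into CMS systems (for example, as captured in HPMS screenshots) against the corresponding value in the SO's final-stage data set. The sketch below uses invented element names and values.

```python
# Illustrative Sub-Standard 3.a check: entries in CMS systems should match the
# SO's final-stage source data (element names and values invented).

final_stage_dataset = {"total_grievances": 445, "grievances_access": 37}
hpms_entries = {"total_grievances": 445, "grievances_access": 37}  # transcribed from HPMS screenshots

mismatches = {
    element: (source_value, hpms_entries.get(element))
    for element, source_value in final_stage_dataset.items()
    if hpms_entries.get(element) != source_value
}
print("All entries match source documents" if not mismatches
      else f"Mismatched elements: {mismatches}")
```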



3.4.2.2.4 Standards 4 and 5

For Standards 4 and 5 (see Exhibit 13), the reviewer must verify that the SO has, and implements, policies and procedures for regular database updates, and for data archiving and restoration. This ensures that data are kept up to date, and that systems are in place for timely data submission or re-submission in the event of data loss.


Exhibit 13. Standards 4 and 5: Data System Updates and Archive/Restoration

DATA VALIDATION STANDARDS 4 AND 5

4. Organization implements policies and procedures for periodic data system updates (e.g., changes in enrollment, provider/pharmacy status, and claims adjustments).

5. Organization implements policies and procedures for archiving and restoring data in each data system (e.g., disaster recovery plan).

3.4.2.2.5 Standards 6 and 7

Standards 6 and 7 (see Exhibit 14) are applicable only in certain situations. Standard 6 is applicable if an SO’s data systems underwent any changes during the reporting period. If this occurred, the reviewer must examine documentation of the changes to ensure there were no issues that adversely impacted the reported data.

Standard 7 applies if any of the data collection or validation processes are outsourced to another entity. This standard assesses whether the SO has policies and procedures in place that address routine monitoring of the delegated entity’s work and whether those policies and procedures are implemented.

The reviewer should mark “Not Applicable” in the Excel-version FDCF if Standard 6 or 7 is not applicable to the reporting section or contract under review.


Exhibit 14. Standards 6 and 7: Data System Changes and Oversight of Delegated Entity Reporting


DATA VALIDATION STANDARDS 6 AND 7

6. If organization’s data systems underwent any changes during the reporting period (e.g., as a result of a merger, acquisition, or upgrade): Organization provided documentation on the data system changes and, upon review, there were no issues that adversely impacted data reported.

7. If data collection and/or reporting for this reporting section are delegated to another entity: Organization regularly monitors the quality and timeliness of the data collected and/or reported by the delegated entity or first tier/downstream reviewer.


3.4.3 Reporting Section Criteria

In addition to the general instructions and validation standards, each set of standards contains a third section with the reporting section criteria. The reporting section criteria vary for each Part C and Part D reporting section. Reporting section criteria are used in conjunction with Standard 2 to determine whether data elements are accurately identified, processed, and calculated. The first three reporting section criteria for each reporting section (see Exhibit 15) are used to validate whether the SO is using the appropriate reporting period, reporting level, and reporting deadline(s) per CMS requirements.


Exhibit 15. Example Reporting Section Criteria for Appropriate Reporting Period, Reporting Level, and Reporting Deadline

REPORTING SECTION CRITERIA

1. Organization reports data based on the required reporting period of 1/1 through 12/31.

2. Organization properly assigns data to the applicable CMS contract and plan benefit package.

3. Organization meets deadline for reporting annual data to CMS by 2/27/17.

Note to reviewer: If the organization has, for any reason, re-submitted its data to CMS for this reporting section, the reviewer should verify that the organization’s original data submission met the CMS deadline in order to have a finding of “yes” for this reporting section criterion. However, if the organization re-submits data for any reason and the re-submission was completed by 3/31 of the data validation year, the reviewer should use the organization’s corrected data submission for the rest of the reporting section criteria for this reporting section.
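
The note above combines two independent decisions, sketched below with invented dates: the deadline finding for criterion 3 depends only on the original submission, while the data used for the remaining criteria may come from a timely re-submission.

```python
# Sketch of the re-submission logic in the note above (dates are invented).
from datetime import date

REPORTING_DEADLINE = date(2017, 2, 27)   # annual reporting deadline (criterion 3)
RESUBMISSION_CUTOFF = date(2017, 3, 31)  # last date a re-submission may be used for DV

original_submission = date(2017, 2, 20)
resubmission = date(2017, 3, 15)         # None if the SO never re-submitted

# Criterion 3 is judged on the ORIGINAL submission only.
criterion_3_finding = "yes" if original_submission <= REPORTING_DEADLINE else "no"

# The remaining criteria use the corrected data if re-submitted by 3/31.
use_corrected_data = resubmission is not None and resubmission <= RESUBMISSION_CUTOFF

print(f"Criterion 3 finding: {criterion_3_finding}")
print("Validate remaining criteria against: "
      + ("corrected (re-submitted)" if use_corrected_data else "original") + " data")
```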


Several of the reporting sections contain a reporting section criterion to validate whether the SO properly defined the key terms that it used to compile reported data per CMS regulations, guidance, and the Technical Specifications. Exhibit 16 shows an example of this criterion for the Part D Coverage Determinations and Redeterminations reporting section.





Exhibit 16. Reporting Section Criterion for Defining Key Terms

REPORTING SECTION CRITERIA

4. Organization properly determines whether a request is subject to the coverage determinations or the exceptions process in accordance with 42 CFR §423.566, §423.578, and the Prescription Drug Benefit Manual Chapter 18, Sections 10 and 30. This includes applying all relevant guidance properly when performing its calculations and categorizations for the above-mentioned regulations, in addition to 42 CFR §423.568, §423.570, §423.572, §423.576 and the Prescription Drug Benefit Manual Chapter 18, Sections 40, 50, and 130. Organization properly defines the term Redetermination in accordance with Title 42, Part 423, Subpart M §423.560, §423.580, §423.582, §423.584, and §423.590 and the Prescription Drug Benefit Manual Chapter 18, Sections 10, 70, and 130. This includes applying all relevant guidance properly when performing its calculations and categorizations.


The other reporting section criteria reference the applicable data element from the Technical Specifications when possible and differ considerably depending on the reporting section and data element. Exhibit 17 shows an example of selected reporting section criteria applicable to the Part C Grievances reporting section. The exact criteria for each Part C and D reporting section are based on the Technical Specifications.


Exhibit 17. Reporting Section Criteria for Selected Part C Grievances Data Elements

REPORTING SECTION CRITERIA

5. Organization accurately calculates the total number of grievances, including the following criteria:

  1. Includes all grievances that were completed (i.e., organization has notified member of its decision) during the reporting period, regardless of when the grievance was received.

  2. Includes all grievances reported by or on behalf of members who were previously eligible, regardless of whether the member was eligible on the date that the grievance was reported to the organization.

  3. If a grievance contains multiple issues filed under a single complainant, each issue is calculated as a separate grievance.

  4. If a member files a grievance and then files a subsequent grievance on the same issue prior to the organization’s decision or the deadline for decision notification (whichever is earlier), then the issue is counted as one grievance.

  5. If a member files a grievance and then files a subsequent grievance on the same issue after the organization’s decision or deadline for decision notification (whichever is earlier), then the issue is counted as a separate grievance.

  6. Includes all methods of grievance receipt (e.g., telephone, letter, fax, and in-person).

  7. Includes all grievances regardless of who filed the grievance (e.g., member or appointed representative).

  8. Includes only grievances that are filed directly with the organization (e.g., excludes all complaints that are only forwarded to the organization from the CMS Complaint Tracking Module (CTM) and not filed directly with the organization). If a member files the same complaint both directly with the organization and via the CTM, the organization includes only the grievance that was filed directly with the organization and excludes the identical CTM complaint.

  9. For MA-PD contracts: Includes only grievances that apply to the Part C benefit (If a clear distinction cannot be made for an MA-PD, cases are reported as Part C grievances).

  10. Excludes withdrawn grievances.

[Data Elements 5.1, 5.3, 5.5, 5.7, 5.9, 5.11, 5.13, 5.15, 5.17, 5.19, 5.21]

6. Organization accurately calculates the number of grievances by category, including the following criteria:

  1. Properly sorts the total number of grievances by grievance category: Enrollment/Disenrollment; Benefit Package; Access; Marketing; Customer Service; Organization Determination and Reconsideration Process; Quality of Care; CMS Issues.

  2. Grievances not falling in a specific listed category are properly assigned to Other Grievances.

[Data Elements 5.1, 5.3, 5.5, 5.7, 5.9, 5.11, 5.13, 5.15, 5.17, 5.19, 5.21]
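
The counting rules in sub-criteria 3 through 5 of criterion 5 above (each issue in a multi-issue complaint counts separately; a refiling on the same issue counts once if it arrives before the decision or decision deadline, and separately if it arrives after) lend themselves to a worked sketch. The following is illustrative only, with invented record structures; it is not the prescribed validation algorithm.

```python
# Illustrative (not prescribed) sketch of grievance counting per sub-criteria 3-5.
# Record structures and dates are invented for the example.
from datetime import date

# Each filing: (member, issue, date filed). Toy data.
filings = [
    ("M1", "access",  date(2016, 2, 1)),
    ("M1", "billing", date(2016, 2, 1)),   # 2nd issue in same complaint -> separate grievance
    ("M2", "access",  date(2016, 3, 1)),
    ("M2", "access",  date(2016, 3, 10)),  # refiled before 3/20 decision -> counted once
    ("M2", "access",  date(2016, 4, 5)),   # refiled after the decision -> separate grievance
]
# Decision date (or decision deadline, whichever is earlier) per member and issue.
decision_date = {
    ("M1", "access"): date(2016, 2, 25),
    ("M1", "billing"): date(2016, 2, 25),
    ("M2", "access"): date(2016, 3, 20),
}

total = 0
open_until = {}  # (member, issue) -> decision/deadline date of the grievance already counted
for member, issue, filed in filings:
    key = (member, issue)
    if key in open_until and filed <= open_until[key]:
        continue            # duplicate filed before the decision: not a new grievance
    total += 1              # new issue, or refiled after the decision/deadline
    open_until[key] = decision_date[key]

print(f"Total grievances: {total}")  # 4
```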

4. PERFORMING DATA VALIDATION ACTIVITIES

4.1 Complete Organizational Assessment Instrument (OAI) and Provide Appropriate Documentation to Selected DVC per the OAI’s Documentation Request



The Organizational Assessment Instrument (OAI) (Appendix E) focuses on how the SO collects, stores, and reports data. Completing the OAI is mandatory, and CMS highly recommends that SOs complete this document in advance of the DV, as the DV review relies significantly on the information captured in this tool. A completed OAI may reduce the reviewer resources required and make the DV review more efficient and effective. SOs should provide the completed OAI to their selected DV reviewer electronically. CMS estimates that the OAI should take a minimum of two weeks to complete; it should be submitted to the reviewer no later than early April. SOs may not send their completed OAI or source code, SOPs, etc. to their reviewers prior to the start of the DV cycle on April 1.

Each SO must provide to its reviewer the basic information regarding its Medicare contracts and which Part C and/or Part D reporting sections each contract submits to CMS. SOs that hold more than one contract with CMS need to complete only one version of the OAI covering all of their contracts. If the information provided in the OAI varies by contract, the document allows for the flexibility to identify the differences for the reviewer in the applicable sections.

All documentation and responses to questions in the OAI should reflect the SO’s systems and processes that were in place during the reporting period(s) undergoing the DV review. For example, if the data being reviewed are for the 2016 reporting period, the SO should include only diagrams of the information systems in place in 2016 or the programming code used in 2016 to calculate the reporting sections.

It is up to the SO and its DV reviewer to work out mutually agreeable methods for sharing and protecting proprietary data, such as that requested in the OAI, and protected health information. The Standards for Selecting a Data Validation Contractor (Appendix A) includes minimum security requirements with which the reviewer’s facility, equipment, and processes must comply. The SO is responsible for ensuring that the reviewer complies with all Health Insurance Portability and Accountability Act (HIPAA) privacy and security requirements.

The SO must supply all the information required for the DV review; otherwise, it will be out of compliance with CMS requirements and will be subject to compliance actions from CMS. If an SO contracts with delegated entities (e.g., PBMs) that are not cooperative in supplying required information, the SO is still responsible for the required information and it is up to the SO to determine how to proceed. Additionally, if an SO or its delegated entity does not provide the information required to determine if a standard or sub-standard has been met, the reviewer is required to select “No” in the FDCF for that standard or sub-standard.



4.2 Analyze OAI Responses


CMS recommends that DV reviewers perform a preliminary review of the documentation submitted with the OAI in advance of each site visit so that any follow-up regarding the documentation can be done during the site visit. The documentation submitted by the SO when completing the OAI should be adequate to enable an effective review. The amount of detail provided in the documentation will determine the ease of the review process, especially for the review of programming code/source code.

Additionally, the OAI provides supplemental questions to help the reviewer better understand the processes used by the SO to compile and submit its reporting sections. The SO’s responses to these questions will provide insight as to who is responsible for the quality control and submission of the data, the processes for incorporating CMS updates to the Technical Specifications into the SO’s systems, and descriptions of any past issues that may have occurred during the reporting process.



4.2.1 Perform OAI Gap Analysis

Upon receiving the completed OAI, the reviewer should review the document for completeness and accuracy. Sections of the OAI that are missing or incomplete should be noted, and the reviewer should follow up with the SO to complete them. It is up to the reviewer to determine whether any identified gaps in the OAI responses must be addressed prior to the site visit or can be addressed during the site visit portion of the review.



4.2.2 Review Source Code and Other Documentation

Data dictionaries and source code are critical for allowing reviewers to map ambiguous field names and internal status codes to meaningful descriptions. Well-organized and structured documentation of the reporting and data extraction processes for the various reporting sections will help the reviewer gain a more thorough understanding of the SO. Reviewers should be familiar with the data systems and processes detailed by the SO in the OAI to ensure thorough preparation for the site visit.



4.2.3 Prepare Interview Discussion Guide

The Interview Discussion Guide (IDG) (Appendix F) is intended to facilitate the discussion between the DV reviewer and the SO’s report owners and subject matter experts. The IDG is a dynamic tool containing both general and reporting section-specific questions that can guide an effective discussion of an SO’s underlying data systems and reporting processes. If, during review of the documentation provided in response to the OAI, the reviewer discovers evidence that may indicate errors in the SO’s data or reporting processes, the reviewer should add questions to the IDG used for that SO to probe any vulnerabilities or opportunities for repeated errors in data collection or reporting. As its name implies, the IDG should serve as a guide: the reviewer may, at its discretion, add questions and/or detail to the document for discussion during site visit interviews, and should ensure the additional detail is documented accordingly.



4.3 Prepare for Site Visit

4.3.1 Select Dates and Appropriate Location(s) of Site Visit

CMS requires SOs and reviewers to include a site visit as part of the DV review in order to: (1) conduct interviews with SO staff, (2) observe the SO’s reporting processes, and (3) obtain census and/or sample files to support the validation of Part C and Part D reporting sections. SOs and DV reviewers are responsible for determining mutually agreeable dates for performing the site visit. It is estimated that the site visit for a full Part C and Part D data validation review should take up to one week to complete.

It is up to the discretion of the reviewer to determine the most appropriate location(s) of the site visit (e.g., virtual, SO’s facility, PBM’s facility, other delegated entity’s facility). CMS encourages SOs and DV reviewers to conduct a physical site visit, but SOs and DV reviewers may elect to conduct a virtual site visit, using a virtual meeting tool or teleconference(s), if appropriate.



4.3.2 Develop Agenda for Site Visit

To further prepare for each SO’s site visit, the DV reviewer and SO should work together to prepare a site visit agenda. Appendix G contains a sample agenda that can be used for the site visit portion of the DV review. This agenda is structured to include an entrance and exit conference, and interviews and demonstrations of data systems for each reporting section included in the DV. It is also recommended that the reviewer create a sign-in sheet to be completed throughout the site visit in order to collect contact information for each SO’s report owners and subject matter experts in case any follow-up is required.

It is important to note that the number of days required to complete the site visit may be contingent upon the size of the SO, the efficiency of the SO’s operations, the level of reporting automation, and the scope of the DV review. The reviewer must schedule sessions with the SO’s report owner(s) for each reporting section and allow sufficient time for the SO to provide an overview of each of the relevant data systems used in gathering data and producing reports, as well as to complete the data extraction/sampling process (see Section 4.4.4 for more information). To ensure optimal time and resource management during the site visit, multiple sessions could be conducted concurrently at the discretion of the review team, or the agenda could be structured so that interviews and demonstrations of reporting processes are scheduled by report owner in order to reduce repetitive discussions and demonstrations, especially where one report owner oversees the processes for multiple reporting sections that use the same data system(s).



4.3.3 Prepare for Data Extraction and Sampling

In preparation for the data extraction and sampling during the site visit, the DV reviewer should review information provided in the completed OAI and, if necessary, hold conference calls with the SO to discuss the SO’s processes. Calls held specifically with each reporting section’s report owner can also provide an opportunity for the reviewer to review the Data Extraction and Sampling Instructions (Appendix H) in more detail and for the report owners to seek clarification as needed. These discussions can also inform the reviewer about the SO’s data systems and sources from which the sample data would be pulled.

There are two methodologies that can be used to extract data for each reporting section. The first is to extract the full census of data for a reporting section, meaning that every data record that is relevant to a reporting section is extracted. When possible, reviewers should attempt to extract the full census. Extracting the census will enable the reviewer to determine with the greatest precision whether reporting sections were submitted accurately. If the size or complexity of a database presents an unusual time burden on the reviewer and/or SO, then the second method, extraction of a random sample, which is a subset of the full census, can be used. Reviewers must use their best judgment to decide if extracting a full census is feasible, or if selecting a random sample will provide the data necessary for the DV review. Refer to Appendix H for further details regarding these two methodologies. In addition, reviewers must determine if the SO’s staff requires supervision during the actual data extraction process, or if the SO’s staff are able to extract data without supervision. See Section 4.4.4 for additional requirements if the reviewer is unable to supervise the data extraction process.



4.4 Conduct Site Visit

4.4.1 Conduct Entrance Conference

The entrance conference provides an opportunity for the DV review team and the SO’s management and individual report owners to introduce themselves and discuss expectations for the site visit. At the entrance conference, the reviewer should describe the objectives for the review and discuss any administrative needs of the team. Optionally, the SO may provide a high-level overview of its organization, focusing on its operations with respect to meeting the CMS reporting requirements. CMS recommends that the entire review team also meet briefly with the SO’s management and individual report owners at the beginning of each day of the site visit to go over administrative needs and review the day’s agenda.



4.4.2 Conduct Interviews with Organization Staff

During the site visit, the reviewer must conduct interviews with the subject matter experts and report owners for each reporting section and reporting system. These interviews provide a first-hand opportunity for the reviewer to gain a thorough understanding of each SO’s data collection and reporting processes involved with meeting CMS reporting requirements. The reviewer should reference the IDG as needed to ensure that all key topics are addressed during the interviews. Also, any outstanding questions and follow-up items identified during the analysis of OAI responses should be addressed during the interviews.



4.4.3 Observe Reporting Processes

The site visit allows the opportunity for the SO to provide a significant amount of useful information to the reviewer. Designated SO staff (i.e., report owners) must provide visual demonstrations of the data systems and reporting processes including data extraction from originating data sources, data analysis, quality assurance processes, and processes for entering or uploading final data into CMS systems. The following is a sample list of the parts of the process that should be demonstrated:

  • Location of report owner and data providers

  • Location and function of all data warehouses

  • Types of data used (format, number of tables)

  • Links and joins to other areas/departments/data

  • Types of programming used to create the reports

  • Review processes and oversight

  • Timeframes for the process (amount of time it takes to run specific parts of the report)

  • Approximations of report volume

  • Updates to the process and system changes

  • Storage locations, security and access constraints



The visual demonstrations provide a clear illustration of the reporting processes, provide the reviewer with insight into the SO’s ability to ensure accurate, valid and timely data, and allow an opportunity to get immediate responses to any questions or concerns about the reported data.


4.4.4 Extract Census or Sample Data

The next step is for the reviewer to work with the report owners to draw a census or a random sample from each reporting section’s final stage data set, following the Data Extraction and Sampling Instructions (Appendix H). The document describes guidelines and methodologies for extracting SOs’ data. Two methodologies of extraction are available to reviewers. The first method is referred to as the census. Extracting all records used in the calculation of data elements for a specific reporting section would constitute extracting a census of data. When possible, reviewers should attempt to extract the full census. Extracting the census will enable the reviewer to determine with the greatest precision whether reporting sections were submitted accurately. The second method used for data extraction is a random sample. The random sample is a subset of the census data. If extraction of the census proves to be too burdensome due to the size or complexity of the data for a specific reporting section, a sample of records should be extracted instead.

The random sampling process involves using built-in random number generators in the applications used to display the data or perform the query (e.g., Microsoft Excel or SQL). Once a random number is assigned to each unique record ID, the data owner can sort the data by the random number field and choose the statistically appropriate number of records as the sample. A discussion of minimum sample sizes can be found in the Data Extraction and Sampling Instructions. The unique IDs from the random sample in the final stage data set are then applied against the source data set to pull the corresponding source data records. The processes used to draw the random data samples vary considerably, depending on the report owner and reporting section. For example, some report owners may be able to easily draw the sample data for their reporting section without having to manually clean or manipulate the data, while other report owners may have to perform more extensive query programming and manual data cleaning in order to draw the sample data. During each of the sessions to demonstrate reporting processes, the SO’s report owners should brief the review team on the processes used to assemble the sample data files, including the source, intermediate, and final stage data sets.

When uploading the DV findings to CMS in the PRDVM, the reviewer must report which data extraction method was used (full census or random sample) to validate data for each standard. For both methods, reviewers must examine source data as a means of verifying that the organization’s underlying data are correct: for example, reviewing customer service call logs or member letters to verify that grievances were properly categorized as grievances and that the grievance categories applied were correct. Source data examples for each reporting section are provided in the Data Extraction and Sampling Instructions.

Reviewers are expected to report the number and percentage of errors or variance from HPMS-filed data found when examining the source data. For purposes of recording results in the FDCF, an error is any discrepancy that either impacted the number of events reported or has the potential to impact the number of events reported in future reporting periods. These errors must be reported in the “Review Results” area of the FDCF, along with the sample size selected for the source data.
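To make the mechanics concrete, the following is a minimal sketch of the random sampling steps described above, written in Python rather than Excel or SQL. The file names, the record_id field, and the sample size are illustrative assumptions, not values from the Data Extraction and Sampling Instructions.

```python
import csv
import random

# Load the final stage data set; the file name and the "record_id"
# column are illustrative assumptions about the SO's extract.
with open("final_stage.csv", newline="") as f:
    records = list(csv.DictReader(f))

# Assign a random number to each unique record, sort by that field,
# and keep the statistically appropriate number of records.
for rec in records:
    rec["_rand"] = random.random()
records.sort(key=lambda rec: rec["_rand"])

SAMPLE_SIZE = 185  # placeholder; minimum sizes come from Appendix H
sample = records[:SAMPLE_SIZE]
sample_ids = {rec["record_id"] for rec in sample}

# Apply the sampled IDs against the source data set to pull the
# corresponding source records for comparison.
with open("source_data.csv", newline="") as f:
    source_rows = [row for row in csv.DictReader(f)
                   if row["record_id"] in sample_ids]
```

Sorting by freshly assigned random numbers and taking the first N records draws a simple random sample of size N without replacement, which is the effect the Excel/SQL procedure described above is intended to achieve.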

It is mandatory that DV reviewers follow the Data Extraction and Sampling Instructions. If the SO’s staff are extracting the data, it is highly recommended that the reviewer supervise the data extraction process to ensure these instructions are followed correctly. If the reviewer is unable to supervise the data extraction process, the reviewer must obtain documentation from the SO describing how the extraction was performed. For example, if a random sample is extracted, the reviewer should request and validate the programming code used to extract the sample data. If a full census is extracted, the reviewer should validate that the record counts match between the census extraction and the source and final stage data files.
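A minimal sketch of the census record-count check, assuming each data set was delivered as a delimited text file with a header row; the file names are placeholders:

```python
def record_count(path: str) -> int:
    """Count data rows in a delimited text file, excluding the header."""
    with open(path) as f:
        return sum(1 for _ in f) - 1

files = ("census_extract.txt", "source_data.txt", "final_stage.txt")
counts = {path: record_count(path) for path in files}
print(counts)
if len(set(counts.values())) != 1:
    print("Record counts differ; request extraction documentation.")
```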

CMS recommends that the DV reviewer record details about each reporting section’s data set in a Data File Inventory Log (Appendix I). Appendix I contains an example log that the reviewer can use. It includes details such as the reporting section name, report owner, data file name, type of data file (e.g., source, intermediate, or final stage data file), number of rows or records, and a description of the file. By completing this log, the review team will be able to easily reference the data files during its post-site visit assessment of the data. Appendix I is only an example; reviewers can use their own inventory log if that is preferable.

The SO should write all data files to tab-delimited or comma-delimited text files with variable names in the first row, and transfer these files to the DV reviewer’s secure storage device for each reporting section’s data. The SO must also provide the reviewer a file layout or data dictionary for the data files in either Word documents or Excel spreadsheets on the same secure storage device. The SO and DV reviewer must ensure that they have established mutually agreeable methods for sharing protected health information and that the reviewer complies with all HIPAA privacy and security requirements.
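As an illustration of the required file format only, the following sketch writes a data set to a tab-delimited text file with variable names in the first row; the field names, file name, and sample record are hypothetical.

```python
import csv

# Hypothetical field names and record; the required layout is simply a
# delimited text file whose first row contains the variable names.
fieldnames = ["record_id", "contract_id", "grievance_category",
              "date_received", "date_of_decision"]
rows = [
    {"record_id": "1001", "contract_id": "H1234",
     "grievance_category": "Benefit Package",
     "date_received": "2016-02-01", "date_of_decision": "2016-02-18"},
]

with open("grievances_final_stage.txt", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames, delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
```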



4.4.5 Conduct Exit Conference

CMS recommends that the entire DV review team meet briefly with the SO’s management and individual report owners at the end of each day of the site visit to go over any action items or outstanding documentation needs. The site visit should conclude with an exit conference, where the reviewer should provide the SO with a summary of next steps and note any follow-up that may need to occur.



4.5 Request Additional Documents (If Required)



CMS recognizes that it may not be possible to obtain all of the required data and documentation during the scheduled site visit, and follow-up conversations and requests may be necessary. The reviewer should make every attempt to gather all required data and documentation during the site visit. If not all information is available, or follow-up is required after the conclusion of the scheduled site visit, the reviewer should hold additional conversations with the SO and/or make requests for documentation. Reviewers and SOs should understand that the DV is an iterative and collaborative effort, and SOs should be prepared to provide additional data and documentation after the site visit has been held.

5 ANALYZING RESULTS AND SUBMISSION OF FINDINGS

5.1 Determine Compliance with Data Validation Standards and Record Findings in Excel Version of the Findings Data Collection Form (FDCF) to Upload into the HPMS PRDVM

Following the site visit, the DV reviewer must assess the documentation and census/sample data received from the SO, as well as the information gained during the interviews and demonstrations of the SO’s reporting processes and information systems.

The DV reviewer must complete the Excel version of the FDCF as it determines the findings for each contract included in the scope of the review. Reviewers must now use the Excel version of the FDCF to record findings and then upload the findings into the PRDVM as a data file; the PRDVM data entry screens are no longer available. The FDCF mirrors the content of the DV Standards document but allows the reviewer to record notes, data sources referenced, and findings for the different standards and criteria specified for a given reporting section. The reviewer will record reporting section-level, and in some cases data element-level, findings for each reporting section. Most DV standards (Standards 1, 4, 5, 6, and 7) are assessed at the reporting section-level, as they assess SO processes that are not likely to vary at the data element-level. Once the reviewer uploads the FDCF into the PRDVM, the reviewer may print the uploaded findings and share them with the SO at any point during the review by accessing the HPMS report entitled “Review Data Validation Findings Report.”

When using the FDCF, reviewers should only complete areas displayed in white for data sources, review results, and findings. Areas displayed in gray are not applicable and should not be completed. In the "Data Sources and Review Results:" column, the reviewer should enter the data sources used and review results for each standard or sub-standard. Next to this column, in the "Findings" column, the reviewer must select the appropriate choice based on whether or not the plan met the requirement for the standard or sub-standard.

5.1.1 Reporting Findings for Standards Using Binary Scale

For all standards except 1.c, 1.d, 1.e, 1.g, 1.h, and 2.e, the findings are reported using binary scale. The reviewer must select "Y" if the requirements for the standard or sub-standard have been completely met. If any requirement for the standard or sub-standard has not been met, the reviewer must select "N." In instances where a standard or sub-standard is not applicable, the reviewer must select “N/A” and must enter the reason for the “N/A” in the “Review Results” field.

CMS expects that there will be situations in which the reviewer finds that an SO is only in partial compliance with specific DV standards. CMS does not believe that only a 100 percent score demonstrates compliance, and has established a threshold whereby a minimum of 90% of records (e.g., sample or census records, source documents, policies and procedures, data entry records) must be accurate in order to record a “Yes” finding for any standard. This threshold should be applied to standards that require the review of policies and procedures when it is possible to readily quantify the adherence to or implementation of those policies and procedures (see Exhibit 19). Exhibit 18 provides examples of how to calculate this minimum threshold specifically for Standard 3.a, for which the DV involves samples or the complete census of records and/or data values. Note that the 90% accuracy threshold does not apply to the individual grievance categories in the Part C and Part D Grievances reporting sections; 100% correct records are required for each data element measured by Standards 2.e and 3.a in these reporting sections.

Exhibit 18 Examples of Calculations to Determine Minimum Threshold of Correct Sample/Census Records for “Yes” Finding

Sample/Census Size | Calculation for Minimum Threshold | Minimum Threshold of Correct Records for “Yes” Finding
150 | 0.90 x 150 = 135 | At least 135 of the records are correct for the standard to be recorded as “Yes.”
205 | 0.90 x 205 = 184.5 | At least 185 of the records are correct for the standard to be recorded as “Yes” (round 184.5 up to 185).
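The worked examples round a fractional result up (184.5 becomes 185), which is consistent with taking the ceiling of 90% of the sample size. A minimal sketch under that interpretation:

```python
import math

def min_correct_for_yes(sample_size: int, threshold: float = 0.90) -> int:
    """Minimum number of correct records needed to record a "Yes" finding.

    Rounds fractional results up (184.5 -> 185), matching the worked
    examples in Exhibit 18; treat the rounding rule as an interpretation,
    not official CMS rounding guidance.
    """
    return math.ceil(sample_size * threshold)

print(min_correct_for_yes(150))  # 135
print(min_correct_for_yes(205))  # 185
```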


Exhibit 19 Example of How to Determine Minimum Threshold of Implemented Policies or Procedures for “Yes” Finding

Standard | Standard Description | Minimum Threshold for a “Yes” Finding
4 | Organization implements policies and procedures for periodic data system updates (e.g., changes in enrollment, provider/pharmacy status, and claims adjustments). Example: the SO has a policy in place for updating its enrollment system on a monthly basis to ensure accurate information and protect data integrity. | In eleven of the twelve months in the contract year, the SO implemented the enrollment system update policy as written (11/12 ≈ 91.7%), meeting the 90% threshold.

5.1.2 Reporting Findings for Standards Using Likert Scale

For Standards 1.c, 1.d, 1.e, 1.g, 1.h, and 2.e, the scoring has been changed from a binary scale to a five-point Likert-type scale. Reviewers are still required to determine the percentage of records that meet the standards, but instead of entering a Y or N to indicate the plan’s performance, the reviewer must enter a score based on the Likert-type scale. The scale corresponds to the percentage of errors found in plan data, as shown below:

  1. Plan data has more than 20 percent error in records – Plan will receive a score of 1 for the given standard/ sub-standard.

  2. Plan data has between 15.1 percent and 20 percent error in records – Plan will receive a score of 2 for the given standard/ sub-standard.

  3. Plan data has between 10.1 percent and 15 percent error in records – Plan will receive a score of 3 for the given standard/ sub-standard.

  4. Plan data has between 5.1 percent and 10 percent error in records – Plan will receive a score of 4 for the given standard/ sub-standard.

  5. Plan data has fewer than 5 percent error in records – Plan will receive a score of 5 for the given standard/ sub-standard.
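A minimal sketch of this mapping as a function. Note that the published bands leave exact boundary values (e.g., exactly 5 or 15 percent) ambiguous, so the strict inequalities below are an interpretation rather than CMS guidance.

```python
def likert_score(error_pct: float) -> int:
    """Map the percentage of records in error to the five-point scale
    used for Standards 1.c, 1.d, 1.e, 1.g, 1.h, and 2.e."""
    if error_pct > 20:
        return 1
    if error_pct > 15:    # 15.1 to 20 percent
        return 2
    if error_pct > 10:    # 10.1 to 15 percent
        return 3
    if error_pct > 5:     # 5.1 to 10 percent
        return 4
    return 5              # fewer than 5 percent

assert likert_score(22.0) == 1
assert likert_score(12.5) == 3
assert likert_score(4.9) == 5
```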

Exhibit 20 provides the different scenarios that a reviewer might face and corresponding scores to be assigned to the plan for the given standard.

Exhibit 20 Scenarios a Reviewer Might Face and Corresponding Findings/Scores for the Given Standard

Standard | Percentage of errors found by the reviewer in plan data | DV Response (DV findings reported in column 'H' of the FDCF)
1.a, 1.b, 1.i, 2.a, 2.b, 2.c, 2.d, 3.a, 3.b, 4, 5, 6, 7 | Fewer than 10 percent error in plan data for the given reporting section/data element(s) | Yes
1.a, 1.b, 1.i, 2.a, 2.b, 2.c, 2.d, 3.a, 3.b, 4, 5, 6, 7 | More than 10 percent error in plan data for the given reporting section/data element(s) | No
1.a, 1.b, 1.i, 2.a, 2.b, 2.c, 2.d, 3.a, 3.b, 4, 5, 6, 7 | Standard not applicable | Leave blank
1.c, 1.d, 1.e, 1.g, 1.h, 2.e | If any of the listed standards is not applicable | Leave blank
1.c, 1.d, 1.e, 1.g, 1.h, 2.e | More than 20 percent error in plan data for the given reporting section/data element(s) | 1
1.c, 1.d, 1.e, 1.g, 1.h, 2.e | Between 15.1 percent and 20 percent error in plan data for the given reporting section/data element(s) | 2
1.c, 1.d, 1.e, 1.g, 1.h, 2.e | Between 10.1 percent and 15 percent error in plan data for the given reporting section/data element(s) | 3
1.c, 1.d, 1.e, 1.g, 1.h, 2.e | Between 5.1 percent and 10 percent error in plan data for the given reporting section/data element(s) | 4
1.c, 1.d, 1.e, 1.g, 1.h, 2.e | Fewer than 5 percent error in plan data for the given reporting section/data element(s) | 5

5.1.3 Review of the Findings Data Collection Form

Exhibit 21 illustrates an example of the FDCF for Standard 1. The reviewer will assess this standard at the reporting section-level and must determine a finding for each of the nine sub-standards contained in Standard 1.


Exhibit 21 Example Rows from FDCF for Standard 1

Standard/Sub-standard ID | Reporting Section Criteria ID | Standard/Sub-standard Description | Data Sources and Review Results (enter review results and/or data sources) | Findings (enter the applicable choice in the appropriate cells; cells marked with an '*' should not be edited)
1 | | A review of source documents (e.g., programming code, spreadsheet formulas, analysis plans, saved data queries, file layouts, process flows) indicates that all source documents accurately capture required data fields and are properly documented. | Data Sources: | *
1.a | | Source documents are properly secured so that source documents can be retrieved at any time to validate the information submitted to CMS via CMS systems. | Review Results: |
1.b | | Source documents create all required data fields for reporting requirements. | Review Results: |
1.c | | Source documents are error-free (e.g., programming code and spreadsheet formulas have no messages or warnings indicating errors, use correct fields, have appropriate data selection, etc.). | Review Results: |
1.d | | All data fields have meaningful, consistent labels (e.g., label field for patient ID as Patient_ID, rather than Field1, and maintain the same field name across data sets). | Review Results: |
1.e | | Data file locations are referenced correctly. | Review Results: |
1.f | | If used, macros are properly documented. | Review Results: |
1.g | | Source documents are clearly and adequately documented. | Review Results: |
1.h | | Titles and footnotes on reports and tables are accurate. | Review Results: |
1.i | | Version control of source documents is appropriately applied. | Review Results: |



Standard 2 requires the reviewer to assess reporting section-level findings for Sub-Standards 2.a through 2.c, which are based on reporting section criteria 1 through 3, and, if applicable, Sub-Standard 2.d, which is based on reporting section criterion 4. Exhibit 22 illustrates an example of the FDCF for Standard 2, Sub-Standards 2.a through 2.d, for the Part D Grievances reporting section.

Exhibit 22 Example Rows from FDCF for Standard 2, Sub-Standards 2.a through 2.d for the Part D Grievances Reporting Section

Standard/Sub-standard ID | Reporting Section Criteria ID | Standard/Sub-standard Description | Data Sources and Review Results (enter review results and/or data sources) | Findings (enter the applicable choice in the appropriate cells; cells marked with an '*' should not be edited)
2 | | A review of source documents (e.g., programming code, spreadsheet formulas, analysis plans, saved data queries, file layouts, process flows) and census or sample data, whichever is applicable, indicates that data elements for each reporting section are accurately identified, processed, and calculated. | Data Sources: | *
2.a | RSC-1 | The appropriate date range(s) for the reporting period(s) is captured. Organization reports data based on the periods of 1/1 through 3/31, 4/1 through 6/30, 7/1 through 9/30, and 10/1 through 12/31. | Review Results: |
2.b | RSC-2 | Data are assigned at the applicable level (e.g., plan benefit package or contract level). Organization properly assigns data to the applicable CMS contract. | Review Results: |
2.c | RSC-3 | Appropriate deadlines are met for reporting data (e.g., quarterly). Organization meets deadlines for reporting data to CMS by 2/6/2017. [Note to reviewer: If the organization has, for any reason, re-submitted its data to CMS for this reporting section, the reviewer should verify that the organization’s original data submissions met the CMS deadline in order to have a finding of “Yes” for this reporting section criterion. However, if the organization re-submits data for any reason and the re-submission was completed by 3/31 of the data validation year, the reviewer should use the organization’s corrected data submission(s) for the rest of the reporting section criteria for this reporting section.] | Review Results: |
2.d | RSC-4 | Terms used are properly defined per CMS regulations, guidance, and Reporting Requirements Technical Specifications. Organization properly defines the term “Grievance” in accordance with 42 CFR §423.564 and the Prescription Drug Benefit Manual, Chapter 18, Sections 10 and 20. This includes applying all relevant guidance properly when performing its calculations and categorizations. Requests for coverage determinations, exceptions, or redeterminations are not categorized as grievances. | Review Results: |




The reviewer must also determine data element-level findings for Sub-Standard 2.e, which examines each data element for compliance with the applicable reporting section criteria; these criteria vary across the data elements reported by the SO. Exhibit 23 illustrates an example of the FDCF for Standard 2, Sub-Standard 2.e, RSC-6 for the Part D Grievances reporting section.


Exhibit 23 Example Rows from FDCF for Standard 2, Sub-Standard 2.e RSC-6 for Part D Grievances Reporting Section

In the FDCF, each reporting section criterion below is assessed separately for each of Data Elements B through W; the form repeats one “Review Results” row and findings cell per data element under each criterion. Those repeated rows are condensed here.

Standard/Sub-standard ID | Reporting Section Criteria ID | Standard/Sub-standard Description | Data Sources and Review Results (enter review results and/or data sources) | Findings (enter the applicable choice in the appropriate cells; cells marked with an '*' should not be edited)
2.e | RSC-6 | The number of expected counts (e.g., number of members, claims, grievances, procedures) are verified; ranges of data fields are verified; all calculations (e.g., derived data fields) are verified; missing data has been properly addressed; reporting output matches corresponding source documents (e.g., programming code, saved queries, analysis plans); version control of reported data elements is appropriately applied; QA checks/thresholds are applied to detect outlier or erroneous data prior to data submission. RSC-5: Organization accurately calculates the total number of grievances, including the following criteria: | Data Sources: | *
2.e | RSC-6.a | Includes all grievances with a date of decision that occurs during the reporting period, regardless of when the grievance was received or completed (i.e., organization notified member of its decision). [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.b | If a grievance contains multiple issues filed by a single complainant, each issue is calculated as a separate grievance. [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.c | If a member files a grievance and then files a subsequent grievance on the same issue prior to the organization’s decision or deadline for decision notification (whichever is earlier), then the issue is counted as one grievance. [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.d | If a member files a grievance and then files a subsequent grievance on the same issue after the organization’s decision or deadline for decision notification (whichever is earlier), then the issue is counted as a separate grievance. [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.e | Includes all methods of grievance receipt (e.g., telephone, letter, fax, and in-person). [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.f | Includes all grievances regardless of who filed the grievance (e.g., member or appointed representative). [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.g | Excludes complaints received only by 1-800-MEDICARE or recorded only in the CMS Complaint Tracking Module (CTM); however, complaints filed separately as grievances with the organization are included. [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.h | Excludes withdrawn Part D grievances. [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.i | For MA-PD contracts: Includes only grievances that apply to the Part D benefit and were processed through the Part D grievance process. If a clear distinction cannot be made for an MA-PD, cases are calculated as Part C grievances. [Data Elements B-W] | Review Results: (one row per data element) |
2.e | RSC-6.j | Counts grievances for the contract to which the member belongs at the time the grievance is resolved, regardless of where the grievance originated (e.g., if a grievance is resolved within the reporting period for a member who has disenrolled from one plan and enrolled in a new plan, then the member’s new plan should report the grievance, regardless of where it originated, if it actually resolves the grievance). [Data Elements B-W] | Review Results: (one row per data element) |



Standard 3 contains two sub-standards: Sub-Standard 3.a requires the reviewer to assess data element-level findings, and Sub-Standard 3.b requires reporting section-level findings. Sub-Standard 3.a is assessed at the data element-level for reporting sections that CMS requires to be manually entered into the HPMS Plan Reporting Module, because it confirms that there were no manual data entry errors for each data element; for reporting sections that are reported as file uploads, it confirms at the sub-standard level that the SO used the correct file layout. Exhibit 24 illustrates an example of the FDCF for Standard 3 for the Part D Grievances reporting section.


Exhibit 24 Example Rows from FDCF for Standard 3 for Part D Grievances Reporting Section

In the FDCF, Sub-Standard 3.a is repeated for each of Data Elements A through W, each with its own “Review Results” row and findings cell; the repeated rows are condensed here.

Standard/Sub-standard ID | Reporting Section Criteria ID | Standard/Sub-standard Description | Data Sources and Review Results (enter review results and/or data sources) | Findings (enter the applicable choice in the appropriate cells; cells marked with an '*' should not be edited)
3 | | Organization implements policies and procedures for data submission, including the following: | Data Sources: | *
3.a | | Data elements are accurately entered/uploaded into CMS systems and entries match corresponding source documents. [Data Elements A-W] | Review Results: (one row per data element) |
3.b | | All source, intermediate, and final stage data sets and other outputs relied upon to enter data into CMS systems are archived. | Review Results: |

Standards 4 through 7 assess organization-level policies and procedures (e.g., for periodic data system updates, archiving and restoring data, data system changes, and oversight of delegated entities); an SO will most likely have these policies and procedures in place for an entire reporting section, as opposed to having them in place for only certain data elements. Exhibit 25 displays example rows from the FDCF for Standards 4 through 7.

Exhibit 25 Example Rows from FDCF for Standards 4 through 7

Standard/Sub-standard ID | Reporting Section Criteria ID | Standard/Sub-standard Description | Data Sources and Review Results (enter review results and/or data sources) | Findings (enter the applicable choice in the appropriate cells; cells marked with an '*' should not be edited)
4 | | Organization implements policies and procedures for periodic data system updates (e.g., changes in enrollment, provider/pharmacy status, and claims adjustments). | Review Results: |
5 | | Organization implements policies and procedures for archiving and restoring data in each data system (e.g., disaster recovery plan). | Review Results: |
6 | | If organization’s data systems underwent any changes during the reporting period (e.g., as a result of a merger, acquisition, or upgrade): Organization provided documentation on the data system changes and, upon review, there were no issues that adversely impacted data reported. | Review Results: |
7 | | If data collection and/or reporting for this reporting section is delegated to another entity: Organization regularly monitors the quality and timeliness of the data collected and/or reported by the delegated entity or first tier/downstream contractor. | Review Results: |


5.1.4 Guidance for Interpreting Standards and Making a Findings Determination

In order to ensure consistency in the review process, CMS provides below a description of the data sources and criteria that reviewers must use to determine findings for each of the DV standards.

Standard 1

This validation standard is assessed at the reporting section-level and is used to determine that all source documents accurately capture required data fields and are properly documented. The guidance for evaluating Standard 1 is described below.


Exhibit 26 Guidance for Standard 1

Data Validation Standard 1: Assessed at the reporting section-level and used to determine that all source documents accurately capture required data fields and are properly documented.

Criteria:

A review of source documents (e.g., programming code, spreadsheet formulas, analysis plans, saved data queries, file layouts, process flows) indicates that all source documents accurately capture required data fields and are properly documented.

Criteria for Validating Source Documents (Sub-Standards):

a) Source documents are properly secured so that source documents can be retrieved at any time to validate the information submitted to CMS via CMS systems.

b) Source documents create all required data fields for reporting requirements.

c) Source documents are error-free (e.g., programming code and spreadsheet formulas have no messages or warnings indicating errors, use correct fields, have appropriate data selection, etc.).

d) All data fields have meaningful, consistent labels (e.g., label field for patient ID as Patient_ID, rather than Field1, and maintain the same field name across data sets).

e) Data file locations are referenced correctly.

f) If used, macros are properly documented.

g) Source documents are clearly and adequately documented.

h) Titles and footnotes on reports and tables are accurate.

i) Version control of source documents is appropriately applied.

Guidance:

Determine if the SO’s source documents (e.g., programming code, spreadsheet formulas, analysis plans, saved data queries, file layouts, process flows) accurately capture the data fields required for each reporting section under review and are documented with the necessary detail and information to create data file sets and other outputs.

Ensure that all source documentation is legible, descriptive, and understandable, including each of the following:

  • Standard Operating Procedures (SOPs) include detailed workflows and processes related to managing, producing, and tracking source documents.

  • Titles and footnotes used in programs and reported output are legible and correspond to HPMS reports and tables.

  • SOPs, file-naming conventions, and dates of source documents and output reports reflect application of version control.

  • Data file locations are referenced correctly within source code (i.e., these files can be located using the references that exist within the source code).

  • Dated HPMS entries match the source document(s) used to create the data entered into HPMS.

Ensure that the data validation reviewer is using documentation that is current and relevant to the time period of the reporting requirements.

Please note that Standards 1 and 2 should be addressed concurrently, given that an evaluation of source documents directly impacts the quality of the actual data and vice versa (i.e., that elements for each reporting section are accurately identified, processed, and calculated). For example, the reviewer should ensure that all source documentation (file layouts, data dictionaries, programming code, work instructions, SOPs, etc.) is available and allows for the complete validation of each reporting section.


Standard 2


This validation standard assesses whether the data elements for each reporting section are accurately identified, processed, and calculated. Each reviewer should ensure that it has staff fluent in the programming language(s) used by the SO (e.g., SQL, SAS, Microsoft VBA). The guidance for evaluating Standard 2 is described below.


Since the DV reviews must be conducted at the contract level, for reporting sections that require reporting at the plan benefit package (PBP) level, if the reviewer finds that the SO incorrectly identified, processed, or calculated the data reported for any of the PBPs included under a contract, then the reviewer must assign a “No” finding in the FDCF for the entire contract for the applicable sub-standard or data element (for Sub-Standard 2.e).


While careful inspection of the source code should detect most errors in the reported data, a careful review of the census or sample data gathered from the SO will minimize the chance that a programming error goes undetected by the reviewer. Many of the same items checked in reviewing the source code can also be checked by analyzing the extracted data sets.



Exhibit 27 Guidance for Standard 2

Data Validation Standard 2: Assesses whether the data elements for each reporting section are accurately identified, processed, and calculated. Each data validation reviewer should ensure that it has staff fluent in the programming language(s) used by the SO (e.g., SQL, SAS, Microsoft VBA).

Criteria:

A review of source documents (e.g., programming code, spreadsheet formulas, analysis plans, saved data queries, file layouts, process flows) and census or sample data, whichever is applicable, indicates that data elements for each reporting section are accurately identified, processed, and calculated.

Criteria for Validating Reporting Section Criteria (refer to the reporting section criteria below):

a) The appropriate date range(s) for the reporting period(s) is captured.

b) Data are assigned at the applicable level (e.g., plan benefit package or contract level).

c) Appropriate deadlines are met for reporting data (e.g., quarterly).

d) Terms used are properly defined per CMS regulations, guidance, and Reporting Requirements Technical Specifications.

e) The number of expected counts (e.g., number of members, claims, grievances, procedures) are verified; ranges of data fields are verified; all calculations (e.g., derived data fields) are verified; missing data has been properly addressed; reporting output matches corresponding source documents (e.g., programming code, saved queries, analysis plans); version control of reported data elements is appropriately applied; QA checks/thresholds are applied to detect outlier or erroneous data prior to data submission.

Guidance:

(Sub-Standards 2.a through 2.d)

Assess the programming code to determine if the data was extracted from the system properly and if the calculations used in reporting data to CMS are accurate according to the reporting section criteria applicable to each reporting section under review.


A thorough review of source code must examine every line of code to ensure the following for each reporting section under review:

  • Data is extracted from the appropriate source system: Verify that all data sets found in the programming code can be traced back to the appropriate source data sets.

  • Data sets are filtered correctly: Verify that data inclusion and exclusion criteria were applied according to the reporting section criteria.

    • For example, proper inclusion of records would mean the source code includes only those records falling within the reporting period date range in the reported data. An example of correct exclusion would be source code that excludes beneficiaries who are not eligible for a particular benefit (e.g., Medication Therapy Management Program).

  • Individual data sets are joined or merged correctly (this is especially important when moving data from source data sets to intermediate data sets): Verify that the correct key data field was used to generate the new data set and that the correct type of join (or data merge) was used to avoid creating duplicate records or improperly combining records from various data sets. A sketch of this check follows the list.

  • Data set progression is accurate: Verify that required data fields in both the source and final stage files allow for file comparison and understanding of data production from source system through the final stage file.

  • If full census data is not extracted, verify that the sample size is sufficient and representative of the population of interest:

    • While the Data Extraction and Sampling Instructions provide minimum sample sizes, reviewers often will need larger data sets to check for errors that occur infrequently. Statisticians should rely on standard statistical practices when determining the proper sample size so that any estimates generated are sufficiently precise.

  • All data elements are accurate: Verify that each data element is consistent with the reporting section criteria. (A brief illustration of several of these checks follows this list.)
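
To illustrate, several of these checks can be automated. The following is a minimal sketch in Python with pandas; the data sets, field names, dates, and reporting period shown are hypothetical and are not drawn from any actual reporting section:

  # Sketch of automated tracing, filtering, and join checks;
  # all data sets and field names are hypothetical.
  import pandas as pd

  # Hypothetical source data set and the SO's filtered extract.
  source = pd.DataFrame({
      "member_id": [1, 2, 3, 4],
      "grievance_date": pd.to_datetime(
          ["2018-01-15", "2018-03-02", "2017-12-30", "2018-02-10"]),
  })
  extract = source[source["grievance_date"].between("2018-01-01", "2018-03-31")]

  # Check 1: every record in the extract traces back to the source data set.
  assert extract["member_id"].isin(source["member_id"]).all()

  # Check 2: inclusion criteria were applied -- no record in the extract
  # falls outside the reporting period date range.
  assert extract["grievance_date"].between("2018-01-01", "2018-03-31").all()

  # Check 3: joining on the key field did not create duplicate records.
  enrollment = pd.DataFrame({"member_id": [1, 2, 4], "plan_id": ["A", "A", "B"]})
  merged = extract.merge(enrollment, on="member_id", how="inner")
  assert not merged.duplicated(subset="member_id").any()

  print(len(merged), "records passed tracing, filtering, and join checks")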




(Sub-Standard 2c)

Assess the Submission Activity Report from the HPMS Plan Reporting Module to determine if appropriate deadlines were met for reporting data by performing the following:


  • Request a copy of the contract’s Submission Activity Report from the SO: This report displays information about the original submission and all subsequent resubmissions for a particular contract or contracts. The report also displays Reporting Period, Contract Number, Plan ID, Submission Version, Due Date and Date Submitted for each section.

  • Determine if the SO has, for any reason, re-submitted its data to CMS for a reporting section: The data validation reviewer should verify that the SO’s original submission(s) met the CMS deadline.

    • If the deadline was met, the reviewer must assess a "Yes" finding for this reporting section criterion. However, if an SO re-submits data for any reason and the re-submission was completed by March 31 of the calendar year of the data validation review (i.e., immediately prior to the data validation review timeframe), the data validation reviewer should use the SO's corrected data submission for performing the validation, not the original data. The March 31 deadline gives the reviewer enough time to include the corrected data in the scope of its review of data and determination of findings.

    • If the SO received CMS permission to submit data after the reporting deadline (i.e., for its first submission), the reviewer must request that the SO show proof that it requested and was granted an extension by CMS. If this proof is valid, then the reviewer should consider the deadline as being met and assess a "Yes" finding for this reporting section criterion.

    • For either of the above scenarios, the reviewer must clearly document the circumstances in the Data Sources and Review Results section of the FDCF. (A brief illustration of the deadline check follows this list.)
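
As an illustration only, the deadline logic above could be scripted as follows. This is a minimal sketch in Python with pandas; the column names mirror fields displayed on the Submission Activity Report, while the dates and the assumed review year (2019) are hypothetical:

  import pandas as pd

  # Illustrative Submission Activity Report rows for one reporting section.
  report = pd.DataFrame({
      "submission_version": [1, 2],
      "due_date": pd.to_datetime(["2019-02-05", "2019-02-05"]),
      "date_submitted": pd.to_datetime(["2019-02-04", "2019-03-20"]),
  })

  # The original submission (version 1) must meet the CMS deadline.
  original = report.loc[report["submission_version"] == 1].iloc[0]
  deadline_met = original["date_submitted"] <= original["due_date"]

  # A re-submission is used for validation only if completed by March 31
  # of the calendar year of the DV review (assumed here to be 2019).
  cutoff = pd.Timestamp("2019-03-31")
  usable = report[report["date_submitted"] <= cutoff]
  version_to_validate = usable["submission_version"].max()

  print("Original met deadline:", deadline_met)
  print("Validate submission version:", version_to_validate)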



(Sub-Standard 2e)

Assess the census/sample data provided by the SO to determine each of the following for each reporting section under review:

  • Data records are selected properly:

    • Perform frequency calculations to list all unique occurrences of data fields pertinent to the calculation of the reporting section to verify they contain values within an acceptable range for the data field.

    • Calculating frequency of occurrence for certain data fields might also alert the reviewer to obvious mistakes in the data extraction.

    • Verify that data has been selected at the proper level (e.g., either the contract or the plan benefit package level).

    • Check date ranges, demographic information, and eligibility information to examine proper data filtering.

  • Individual data sets are joined or merged correctly:

    • When individual data sets are available (most likely for intermediate data sets), sample a few records from them to confirm that the data sets were joined properly.

    • Check for duplicate records and determine if record counts for the component data sets agree with those found in the merged data set.

  • All data elements are calculated accurately:

    • Recalculate the data fields that the SO used to calculate the data elements and refer to the reporting section criteria for each reporting section.

    • Calculate sums of the individual records within each reporting section to ensure that they equal those reported to CMS.

    • Verify that the calculation of each of the data elements is consistent with the reporting section criteria.*

* CMS has added a new reporting section criterion (RSC #5), which reviewers will use to confirm that the data do not contain any logical errors. RSC #5 consists of data integrity checks, which the reviewer must verify at the data element level. The checks include confirming that a data element does not include outlier records (e.g., for Part C Organization Determination and Reconsideration, RSC 5.g checks whether the date of disposition for each reopening (Data Element 6.30) falls after the date of the original disposition (Data Element 6.26)) and confirming that the data reported are valid (e.g., for Part C Organization Determination and Reconsideration, confirming that a valid value is submitted for reopening disposition (Data Element 6.31), where the valid choices are Fully Favorable, Partially Favorable, Adverse, or Pending).
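
To illustrate RSC #5, the sketch below (Python with pandas) applies the two checks named above to hypothetical records standing in for Data Elements 6.26, 6.30, and 6.31; the record values and the reported total are invented for the example:

  import pandas as pd

  # Illustrative records standing in for Data Elements 6.26, 6.30, and 6.31.
  recs = pd.DataFrame({
      "orig_disposition_date": pd.to_datetime(["2018-03-01", "2018-05-10"]),
      "reopen_disposition_date": pd.to_datetime(["2018-04-15", "2018-05-01"]),
      "reopen_disposition": ["Fully Favorable", "Denied"],
  })

  # Outlier check (RSC 5.g style): the reopening disposition date must
  # fall after the original disposition date.
  bad_dates = recs[recs["reopen_disposition_date"] <= recs["orig_disposition_date"]]

  # Valid-value check: the reopening disposition must be one of the
  # choices named in the criteria.
  valid = {"Fully Favorable", "Partially Favorable", "Adverse", "Pending"}
  bad_values = recs[~recs["reopen_disposition"].isin(valid)]

  # Recalculation check (Sub-Standard 2e): the count of individual records
  # must equal the total reported to CMS (reported total is illustrative).
  reported_total = 2
  assert len(recs) == reported_total

  print(len(bad_dates), "record(s) fail the date-order check")
  print(len(bad_values), "record(s) fail the valid-value check")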


Exhibit 26 provides several examples of how to review source code and evaluate the integrity of the data. However, the reviewer may use other methods of DV to ensure a comprehensive and complete review of the source code and census/sample data. The reviewer must clearly document all errors found in programming code, referring to the program examined, the precise location in the program, the nature of the error, and the impact of the error in the “Data Sources and Review Results” section of the FDCF. Likewise, any evidence from the review of census/sample data that leads to a negative finding must be clearly documented in the applicable section of the FDCF.

Standard 3

This validation standard assesses whether the SO implements policies and procedures for entering and/or uploading each data submission to CMS systems. The guidance for evaluating Standard 3 is described in Exhibit 28.

Exhibit 28 Guidance for Standard 3

Data Validation Standard 3:

Assesses whether the SO implements policies and procedures for entering or uploading each data submission to CMS systems.

Criteria

Guidance

Organization implements policies and procedures for data submission, including the following:

a) Data elements are accurately entered / uploaded into CMS systems and entries match corresponding source documents.

b) All source, intermediate, and final stage data sets and other outputs relied upon to enter data into CMS systems are archived.

(Sub-Standard 3a)

Determine who is responsible for entering/uploading data into CMS systems for each reporting section under review and whether the SO has written work instructions or policies and procedures for the entry or submission of the Part C and Part D Reporting Requirements.


Evaluate Sub-Standard 3a by performing the following actions:

  • Compare the data file created for submission to CMS with a copy of the HPMS screen shots of data entered to confirm there were no manual data entry errors.

  • For file uploads, confirm that the data file adheres to the record layout specified in the applicable Technical Specifications document.

  • For the reporting sections that require reporting at the plan benefit package (PBP) level, if the reviewer finds that the SO did not accurately enter and/or upload data reported for any of the PBPs included under a contract, then the reviewer must assign a “No” finding in the FDCF for the entire contract for the applicable data element(s) for Sub-Standard 3a.

  • If a reporting section requires both a file upload and data entry, both have to occur in order for an SO to meet Sub-Standard 3a. (A sketch of a record layout check follows this list.)
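
As one possible approach, the record layout comparison for file uploads could be scripted. This sketch in Python assumes a comma-delimited upload file and a hypothetical expected layout; the authoritative layout is the one specified in the applicable Technical Specifications document:

  import csv

  # Hypothetical expected record layout; the actual layout comes from the
  # applicable Technical Specifications document.
  EXPECTED_COLUMNS = ["contract_number", "plan_id", "data_element_1", "data_element_2"]

  def check_layout(path):
      """Return a list of layout problems found in a comma-delimited upload file."""
      problems = []
      with open(path, newline="") as f:
          reader = csv.reader(f)
          header = next(reader)
          if header != EXPECTED_COLUMNS:
              problems.append("header mismatch: " + ", ".join(header))
          for line_no, row in enumerate(reader, start=2):
              if len(row) != len(EXPECTED_COLUMNS):
                  problems.append(
                      "line %d: expected %d fields, got %d"
                      % (line_no, len(EXPECTED_COLUMNS), len(row)))
      return problems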


(Sub-Standard 3b)

Determine if the SO has a policy or procedure for archiving all source, intermediate, and final stage data sets relied upon to enter data into CMS systems, and confirm that the SO implemented this policy for the reporting section under review.





Standard 4


This validation standard is assessed at the reporting section-level and is used to assess whether the SO has and implements policies and procedures for regular database updates. The data sources and criteria for evaluating Standard 4 are described in Exhibit 29.


Exhibit 29 Guidance for Standard 4

Data Validation Standard 4:

Assessed at the reporting section-level and is used to assess whether the SO has and implements policies and procedures for regular database updates.

Criteria

Guidance

Organization implements policies and procedures for periodic data system updates (e.g., changes in enrollment, provider/pharmacy status, claims adjustments).

Determine if the SO has policies and procedures in place for performing periodic updates to each data system used for the reporting section under review, to ensure that reported data are accurate and timely.


Determine if the SO implements and adheres to the policies and procedures referenced above (i.e., was any data for the reporting section under review negatively impacted by a failure to implement or follow these policies and procedures?).



Standard 5


This validation standard is assessed at the reporting section-level and is used to assess whether the SO has and implements policies and procedures for data archiving and restoration. The data sources and criteria for evaluating Standard 5 are described in Exhibit 30.


Exhibit 30 Guidance for Standard 5

Data Validation Standard 5:

Assessed at the reporting section-level and is used to assess whether the SO has and implements policies and procedures for data archiving and restoration.

Criteria

Guidance

Organization implements policies and procedures for archiving and restoring data in each data system (e.g., disaster recovery plan).

Determine if the SO has policies and procedures in place for archiving and restoring data in each data system used for the reporting section under review, to ensure timely data submission or re-submission in the event of data loss.


Determine if the SO implements and adheres to the policies and procedures referenced above (i.e., was any data for the reporting section under review negatively impacted by a failure to implement or follow these policies and procedures?).


Standard 6


This validation standard is assessed at the reporting section-level and is used to assess whether the validity of the SO’s data was adversely impacted by any changes to data systems during the reporting period. The data sources and criteria for evaluating Standard 6 are described in Exhibit 31.


Standard 6 applies if an SO’s data systems underwent any changes during the reporting period. The DVC should mark “Not Applicable” in the FDCF if Standard 6 is not applicable to the contract under review.

Exhibit 31 Guidance for Standard 6

Data Validation Standard 6:

Assessed at the reporting section-level and is used to assess whether the validity of the SO’s data was adversely impacted by any changes to data systems during the reporting period.

Criteria

Guidance

If organization’s data systems underwent any changes during the reporting period (e.g., as a result of a merger, acquisition, or upgrade): Organization provided documentation on the data system changes and, upon review, there were no issues that adversely impacted data reported.

Review documentation on data system changes and determine whether changes to an SO's data systems adversely impacted reported data by conducting the following activities:

  • Determine if there were any changes to data sources used for data collection and storage, data processing, analysis, and reporting for the reporting section under review.

  • Determine if data system changes were the root cause of any outlier notices received from CMS for the reporting section under review.

  • Determine if the SO implemented any process or quality improvement activities during the reporting period specifically related to the data system change for the reporting section under review.


Determine if the validity of the SO’s data was adversely impacted by any changes to data systems during the reporting period.



Standard 7


This validation standard is assessed at the reporting section-level and is used to assess whether the SO routinely monitors the quality of a delegated entity’s work and processes related to the reporting requirements. The data sources and criteria for evaluating Standard 7 are described in Exhibit 32.

Standard 7 applies if any of the data collection or validation processes are outsourced to another entity. The reviewer should mark “Not Applicable” in the FDCF if Standard 7 is not applicable to the reporting section or contract under review.

Exhibit 32 Guidance for Standard 7

Data Validation Standard 7:

Assessed at the reporting section-level and is used to assess whether the SO routinely monitors the quality of a delegated entity's work and processes related to the reporting requirements.

Criteria

Guidance

If data collection and/or reporting for this reporting section are delegated to another entity: Organization regularly monitors the quality and timeliness of the data collected and/or reported by the delegated entity or first tier/downstream reviewer.

Assess the following if data collection and/or reporting for a reporting section is delegated to another entity:

  • Determine if the SO has policies and procedures in place for overseeing the delegated entity’s reporting process / results for the reporting section under review.

  • Determine if the SO implements and adheres to the policies and procedures referenced above (i.e., was any data for the reporting section under review negatively impacted by a failure to implement or follow these policies and procedures?).


Plans are not expected to replicate the delegated entity's process and recalculate all of its numbers, but they are expected to have policies and procedures in place for routine monitoring. These policies and procedures are expected to be implemented as frequently as needed to verify the delegated entity's reporting.


SOs are responsible for a delegated entity's calculations and numbers; therefore, if those are incorrect, the responsibility ultimately falls on the SO.



5.2 Provide Draft Findings to Sponsoring Organization


Once the findings have been documented in the FDCF, the reviewer must share the draft findings with the SO.

When the DV reviewer uploads the Microsoft Word version of the FDCF into the PRDVM during its review, it may print the uploaded findings and share them with the SO at any point during the review by accessing the PRDVM report entitled “Review Data Validation Findings Report.”


5.3 Review Draft Findings with Sponsoring Organization and Obtain Additional Documentation Necessary to Resolve Issues


The SO and DV reviewer should build time into the April-June DV schedule to allow sufficient review of the findings. Any issues identified during this review must be resolved prior to the data validation reviewer’s June 30 deadline for submitting findings to CMS.


Following any review of the draft findings with the SO, the reviewer must update the FDCF with any necessary revisions. This final version will be used to report the results of the data validation review to CMS.


5.4 Submit Data Validation Review Findings via HPMS PRDVM


      5.4.1 Data Validation Contractor's Submission of Findings


Following the conclusion of the DV review and the finalization of findings, the reviewer must report the findings by uploading the FDCF to CMS via the PRDVM in HPMS by June 30. Instructions for using this module are contained in the PRDVM Quick Reference Guide, which is available in the PRDVM. The FDCF includes review results and/or data sources that were reviewed for each standard or sub-standard, as well as the Yes, No, or Not Applicable finding associated with each standard or sub-standard. Reviewers should also indicate which extraction method (full census or sample) was used for each standard.


      5.4.2 Sponsoring Organization Disagreement with Findings


If the SO disagrees with any of the findings submitted by the DV reviewer, it may submit information indicating this disagreement to CMS within 30 calendar days of the date that final findings are submitted via the PRDVM. Submissions should be sent to CMS via the [email protected] email box and should contain all of the following information in order to be considered for review:

  • Email subject line must state: “Data Validation: Reported Findings Discrepancy”

  • Content of email must include the information below, in list format and in the following order:

    • Name of SO

    • CMS contract number(s)

    • SO's contact name, title, phone number, and email address

    • Name of reviewer organization

  • For each area of discrepancy, list the following information:

    • Part C or Part D, name of reporting section

    • Standard/ sub-standard ID, reporting section criteria ID

    • Description of the reviewer's finding

    • Reason for disagreement with finding

    • Steps that were taken to resolve the disagreement with the reviewer prior to the submission of the finding

    • Outcome of discussions, areas of impasse, and any additional information


CMS will review any findings disagreements on a case-by-case basis.



6 POST-DATA VALIDATION ACTIVITIES



6.1 Compile Archive of Data Validation Work Papers



The DV reviewer must prepare a complete archive of work papers associated with the annual DV and provide it to the SO. At a minimum, this archive must contain the documentation described in Exhibit 33. The reviewer should also retain a complete copy of this archive in accordance with its contract with the SO.

When the SO receives the archive from the reviewer, the SO must add the documentation of its reviewer selection process to the archive, including how its chosen reviewer meets the minimum qualifications, credentials, and resources set forth in the Standards for Selecting a Data Validation Contractor. The SO must retain this complete archive for the 10-year retention period required per federal regulations and be prepared to provide the archive to CMS upon request.

Exhibit 33 Minimum Documentation Required for Data Validation Archive

DATA VALIDATION ARCHIVE


  • Documentation of Data Validation Contractor Selection Process

  • Documentation of completion of CMS Data Validation Training for all staff assigned to the data validation team

  • Completed OAI, including all documentation provided in response to OAI Section 5

  • Final Site Visit Agenda

  • Completed Sign-in Sheets from site visit (if used)

  • Final IDG used during site visit

  • Copies of any formal presentations during site visit

  • Notes on staff interviews and demonstrations during site visit

  • Census/sample data

  • Additional documentation provided by SO during/after site visit

  • Draft findings in FDCF

  • Notes on issues resulting in changes to draft findings

  • Final FDCF


6.2 Receive Pass or Not Pass Threshold Level and Assess Pass or Not Pass Determination Based on Final Scores

6.2.1 Pass/Not Pass Determination


For each reporting section, CMS has assigned a score to each of the standards or sub-standards. CMS assigns a score based on the findings uploaded into the PRDVM by the DV reviewer. A standard or sub-standard receiving a “Yes” finding will receive the points assigned to that standard or sub-standard, while a “No” finding will result in zero points being assigned to the standard or sub-standard. The Data Validation Pass/Not Pass Determination Methodology (Appendix K) identifies the individual score CMS has assigned to each standard and sub-standard for all reporting sections.


After all findings are submitted to CMS, CMS will calculate a percentage score for all Part C reporting sections as a group, all Part D reporting sections as a group, and a combined Part C and Part D determination for those contracts reporting both Part C and Part D data. CMS then establishes passing thresholds for Part C, Part D, and an overall combined Part C/Part D score based on the distribution of scores.
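
To illustrate the arithmetic, the sketch below (Python) computes a percentage score from a set of findings. The point values are hypothetical; the actual scores assigned to each standard and sub-standard appear in Appendix K:

  # Each tuple pairs the points assigned to a standard or sub-standard with
  # the reviewer's finding; point values here are hypothetical (see Appendix K).
  findings = [(5, "Yes"), (3, "No"), (2, "Yes"), (4, "Yes")]

  # "Yes" earns the assigned points; "No" earns zero.
  earned = sum(points for points, finding in findings if finding == "Yes")
  possible = sum(points for points, _ in findings)
  score_pct = 100.0 * earned / possible

  print("Score: %d/%d = %.1f%%" % (earned, possible, score_pct))  # 11/14 = 78.6%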



6.2.2 CMS Notification to Sponsoring Organization of Pass/Not Pass Determinations

CMS will release a memo through the HPMS PRDVM regarding the thresholds established. SOs then determine whether they passed by comparing the score received via HPMS against the threshold announced in the memo. If an SO does not pass, it will receive follow-up communication from CMS.


6.3 Sponsoring Organization Appeal of Data Validation Determination (If Applicable)


An SO has the right to appeal any Not Pass determination(s) it receives for the Part C and/or Part D reporting sections or for the overall combined Part C and Part D determination. Please note that the pass/not pass thresholds are not applied to individual reporting sections.


If the SO wishes to appeal a Not Pass determination, it must submit an appeal to CMS within 5 business days of receiving information from CMS about the threshold level. Submissions must be sent to CMS via the [email protected] email box and must contain all of the following information in order to be considered.

  • Email subject line must state: “Data Validation: Appeal of Not Pass Determination”

  • Content of email must include the information below, in list format and in the following order:

    • Name of SO

    • CMS contract number(s)

    • SO's contact name, title, phone number, and email address

    • Name of reviewer organization

  • For each Not Pass determination included in the appeal, list the following information:

    • Indicate whether the appeal pertains to the overall Not Pass for Part C and/or Part D reporting sections

    • CMS contract number(s) that received the subject Not Pass determination

    • Justification for appeal

    • Include as an attachment any documentation supporting the justification for appeal. The documentation must have been in existence at the time of the DV. For example, if after the DV the SO resubmits corrected data, revises a policy and procedure, or corrects programming code that caused it to improperly calculate reported data, the SO cannot submit documentation of these corrections to appeal a Not Pass determination.


Once the appeal is received, CMS will carefully consider the justification and any supporting documentation to determine if the Not Pass determination should be changed to a Pass determination. CMS has not established a timeframe for the consideration of SO appeals.

1 See 42 CFR §422.516(g) and §423.514(g)

2 See 42 CFR §422.504(d) and § 423.505(d)

3 These instructions should not discourage SOs from re-submitting corrected data to CMS if necessary; however, re-submissions after March 31 will not be included in the scope of the DV review and will not change a reviewer's “No” finding or a CMS determination of Not Pass.
