(CMS-10519) Physician Quality Reporting System and the Electronic Prescribing Incentive Program Data Assessment, Accuracy and Improper Payments Identification Support

OMB: 0938-1255

Supporting Statement - Part B

Collections of Information Employing Statistical Methods

1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

RESPONSE:

The Physician Quality Reporting System (PQRS) and Electronic Prescribing Incentive (eRx) Program Data Assessment, Accuracy and Improper Payments Identification Support contract was created to identify and address problems with data handling, data accuracy, and incorrect payments for the PQRS and eRx Programs.

Because the data submitted by, or on behalf of, eligible professionals (EPs) to the PQRS and eRx Programs is used to calculate incentive payments and payment adjustments, it is critical that this data is accurate. Additionally, the data is used to generate Feedback Reports for EPs and, in some cases, is posted publicly on the CMS website, further supporting the need for accurate and complete data.

The ultimate use of the clinical quality reporting data is to improve the quality of care for Medicare beneficiaries. This aligns with the CMS mission and helps to make healthcare more cost-effective and efficient.



To determine if data quality issues exist and if the incentive payments are correct, additional information is required. Surveys are one tool that will be used to collect this data, and they will be sent to the following reporting entities: Group Practices using the Group Practice Reporting Option (GPRO), Registries, Data Submission Vendors (DSVs), Eligible Professionals (EPs) using Electronic Health Records (EHRs), and EPs reporting via claims.

The survey is completely automated and was designed with simplicity as a core requirement – it does not require a login and can be accessed via a link provided in a survey invitation email. There is no Protected Health Information (PHI) or Personally Identifiable Information (PII) submitted in the survey. In order to minimize the burden on the participant community, the number of questions in a survey will not exceed thirty-three. The majority of the questions in the survey are “point and click”, allowing the participant to complete the survey quickly. There is a Feedback section included in the survey, which allows for free-form text entry and document upload; however, document uploads are not required.



Sampling, as it relates to this effort, will limit the number of entities that receive the survey and, consequently, the data examined to identify errors and incorrect payments made from Physician Quality Reporting System (PQRS) data. The projected samples by contract option year are provided in Table 1.

Table 1: Sampling Size Distribution

Entity                       Base Year   Option Year 1   Option Year 2   Option Year 3
GPRO                         NA          20              20              20
Registries                   9           10              10              10
EHR Direct                   NA          30              30              30
EHR DSV                      NA          5               5               5
EPs submitting via Claims    NA          50              50              50
Total                        55          115             115             115

Methodology

Tables 2 and 3 below provide a summary of the proposed sample sizes and the methods used for GPROs and Registries. A similar technique will be used for sampling the other entities (EHR Direct, EHR DSV).

Table 2: Methodology Table (GPROs)

Information Gathering Method:  Web Survey
Entity:                        GPRO
Sample:                        Targeted selection of 20 GPROs
Methodology:                   Structured web-based survey with questions specific to data handling processes, training, and quality assurance.
Output:                        Narrative report with a list of prioritized issues and recommended best practices related to data handling.


Table 3: Methodology Table (Registries)

Information Gathering Method:  Web Survey
Entity:                        Registry
Sample:                        Targeted selection of 10 Registries
Methodology:                   Structured web-based survey with questions specific to data handling processes, training, and quality assurance.
Output:                        Narrative report with a list of prioritized issues and recommended best practices related to data handling.




2. Describe the procedures for the collection of information, including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.



RESPONSE:

GPRO

We will survey 20 GPROs in this task. This number balances obtaining information from an adequate number of GPROs, the burden on providers associated with survey participation, and the resources required to execute this task.

GPRO Data Preparation

The following steps will support the sampling of GPROs:


  • Download from the CMS PQRS data store a GPRO PQRS XML submission extract, including identifiers for the group (name, Taxpayer Identification Number (TIN), and National Provider Identifier (NPI)), individual providers (name and NPI), and beneficiaries (name and Health Insurance Claim Number (HICN)). Include the contact information (address, phone number, email) where available, and request it if not available.

  • Download from the CMS PQRS data store an eRx submission extract, including identifiers for the group (name, TIN and NPI), individual providers (name and NPI), and beneficiaries (name and HICN). Include the contact information (address, phone number, email) where available, and request it if not available.

  • Compile targeting information from CMS PQRS documents describing GPRO reporting experience and GPRO reporting error logs, to identify potential problem GPROs.

  • Merge PQRS and eRx submitter information by GPRO; a minimal merge sketch follows this list.
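
The sketch below illustrates, in Python, one way the merge step could be carried out. It is a minimal illustration only: the file names, the "tin" join key, and the pandas-based approach are assumptions made for this sketch, not the contract's actual tooling or extract schema.

    # Minimal sketch of merging the PQRS and eRx extracts by GPRO.
    # File names and the join column ("tin") are hypothetical.
    import pandas as pd

    pqrs = pd.read_csv("gpro_pqrs_extract.csv")   # hypothetical extract file
    erx = pd.read_csv("gpro_erx_extract.csv")     # hypothetical extract file

    # An outer merge keeps GPROs that appear in only one program, so the
    # strata with and without eRx participation can both be formed later.
    merged = pqrs.merge(erx, on="tin", how="outer", suffixes=("_pqrs", "_erx"))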

GPRO Sampling Methodology

We will construct a stratified sample frame from the entire list of GPROs submitting measures and eRx information during 2012. The sample will be stratified by targeted (75%) versus non-targeted submitters (25%). The sample will also be stratified by submitters with or without eRx participation, taking 50% from each group.
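
Under these proportions, the 20-GPRO sample works out to 15 targeted and 5 non-targeted submitters, each split roughly evenly by eRx participation. The Python sketch below makes that arithmetic explicit; the function name and the rounding rule are illustrative assumptions, not part of the contract's method.

    # Illustrative allocation of the sample across the four strata.
    # Python's round() uses banker's rounding, so odd strata split 8/7 and 2/3.
    def allocate_sample(total, targeted_share=0.75, erx_share=0.50):
        targeted = round(total * targeted_share)       # 75% targeted
        non_targeted = total - targeted                # 25% non-targeted
        return {
            "targeted_erx": round(targeted * erx_share),
            "targeted_no_erx": targeted - round(targeted * erx_share),
            "non_targeted_erx": round(non_targeted * erx_share),
            "non_targeted_no_erx": non_targeted - round(non_targeted * erx_share),
        }

    print(allocate_sample(20))
    # {'targeted_erx': 8, 'targeted_no_erx': 7,
    #  'non_targeted_erx': 2, 'non_targeted_no_erx': 3}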

We will calculate the priority scores for targeting the sample by obtaining from CMS the lists of data submission issues for each GPRO. From this information, we will prepare a prioritization spreadsheet that tabulates the number of issues for each GPRO, including, for example:

  • Submitted rate not equal to calculated rate.

  • Numerator greater than denominator.

  • Provider NPI submitted in multiple instances (individually and within the GPRO).

  • Null or zero rate.

  • Incorrect format.

  • Invalid measure.

In the spreadsheet, we will also include the total submissions for the year for each submitter. The number of issues and total submissions will be converted to standard z-scores by subtracting the mean and dividing by the standard deviation. We will then multiply these two z-scores together to determine the final priority score for ranking. Thus, submitters with more issues reported and a larger number of submissions will receive the highest scores and have the greatest chance of inclusion in the sample.
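
As a concrete illustration of this calculation, the following Python sketch standardizes both columns and multiplies them, as described above; the argument names and example figures are hypothetical, not actual submitter data.

    # Hedged sketch of the priority-score calculation described above.
    from statistics import mean, stdev

    def priority_scores(issues, submissions):
        """z-score each column, then multiply element-wise for ranking."""
        def z(values):
            m, s = mean(values), stdev(values)
            return [(v - m) / s for v in values]
        return [zi * zs for zi, zs in zip(z(issues), z(submissions))]

    # Hypothetical example: the submitter with many issues AND many
    # submissions (the second one) receives the highest priority score.
    scores = priority_scores(issues=[2, 10, 4], submissions=[500, 2000, 800])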

The agreed-upon sample sizes are provided in Table 2. In addition, in each stratum, we will include a sample of spares to draw from in the event of a non-response. In the case of targeted submitters, the spares will be drawn in the order of the priority score used to select the top submitters. Spares used to replace each non-responder will be drawn simply by moving sequentially down the list, taking the submitter with the next highest score. Non-targeted spares will be selected randomly.
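
The replacement rule can be summarized in a few lines of Python; the data shapes below (a list of (name, score) pairs for targeted spares, a plain list for non-targeted spares) are assumptions made for illustration.

    # Illustrative spare-replacement rule for a single non-responder.
    import random

    def next_spare(targeted_spares, non_targeted_spares, targeted):
        """Targeted spares are taken in descending priority-score order;
        non-targeted spares are drawn at random."""
        if targeted:
            # (name, score) pairs, kept sorted with the highest score first
            targeted_spares.sort(key=lambda pair: pair[1], reverse=True)
            return targeted_spares.pop(0)[0]
        return non_targeted_spares.pop(random.randrange(len(non_targeted_spares)))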


For the survey, we will send notification letters on CMS letterhead and use follow-up phone calls to encourage non-responders to comply with our requests. The notification letter will mention incentives for responding, including the opportunity to learn best practices for submitting measures and to receive feedback that will help improve the quality of reporting. Consenting submitters will also be recognized by CMS and listed on the QualityNet website as having completed the survey. Sampled entities that do not respond initially will receive a follow-up contact after 30 days but will have 45 days to respond. After 45 days and 3 follow-up phone calls, non-responders will be dropped and substitutes from the spare sample will be contacted.

Registry

For Registries we will follow the same general process as for selecting GPROs and select 10 Registries.

Registry Data Preparation

The process for preparing data for Registries is similar to that used for GPROs, with the exception that member-level information is not available in the XML extracts. Consequently, we will conduct the following steps to prepare the registry data for sampling:

  • Download from the CMS PQRS data store a Registry PQRS XML submission extract, including identifiers for the group (name, TIN and NPI) and individual providers (name and NPI), together with contact information.

  • Download from the CMS PQRS data store an eRx submission extract, including identifiers for the group (name, TIN and NPI), individual providers (name and NPI), and beneficiaries (name and HICN). Include the contact information (address, phone number, email) where available, and request it if not available.

  • Compile targeting information from CMS PQRS documents describing Registry reporting experience and Registry error reporting logs to identify potential problem Registries.

  • Merge PQRS and eRx submitter information by registry.

Registry Sampling Methodology

We will construct a stratified sampling frame from the entire list of Registries submitting measures and eRx information during 2012. The sample will be stratified by targeted (75%) versus non-targeted submitters (25%). The sample will also be stratified by submitters with or without eRx participation, taking 50% from each group.

A prioritization spreadsheet, created for Registries, will drive the targeted sample selection. The same method as described above for GPROs will be used.

The agreed-upon sample sizes are provided in Table 3. In addition, within each stratum, we will include a sample of spares to draw from in the event of a non-response. As with the GPROs, the spares for targeted submitters who do not respond will be drawn in the order of the priority score used to select the top submitters. Spares used to replace each non-responder will be drawn simply by moving sequentially down the list, taking the submitter with the next highest score. Non-targeted spares will be selected randomly.

3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield 'reliable' data that can be generalized to the universe studied.

RESPONSE:

For the survey, we will send notification letters on CMS letterhead and use follow-up phone calls to encourage non-responders to comply with our requests. Again, the notification letters will mention incentives to encourage response. Sampled entities that do not respond will receive a follow-up contact after 30 days but will have 45 days to respond. After 45 days and 3 follow-up phone calls, non-responders will be dropped and replaced with substitutes from the spare sample.

The PRA package contains the email templates (PQRS Data Validation Electronic Survey - Att-C-Recruitment-comms.doc) that will be sent to entities.

4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

RESPONSE:

A survey pilot test will be administered to 3 GPROs and 6 Registries. Based on the performance in the pilot survey, we will modify the survey by rewording, adding, or deleting questions that do not yield valuable information or result in ambiguous responses. If more than 9 pilot test surveys are required, we will seek PRA approval for the additional surveys.

5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Table 4: Response

Contact Name     Telephone Number   Agency Name   Email Address
Tim Champney     703-535-1454       IMS           [email protected]
Mary Braman      202-955-3583       NCQA          [email protected]
Andrew Weller    703-626-6106       IBM           [email protected]

