Hospital Outpatient Quality Reporting (OQR) Program (CMS-10250)

OMB: 0938-1109


Supporting Statement - Part B

Submission of Information for the Hospital Outpatient Quality Reporting (OQR) Program


Collection of Information Employing Statistical Methods


1. Describe potential respondent universe.


All hospitals defined under section 1886(d)(1)(B) of the Social Security Act (known as "subsection (d) hospitals") that receive Medicare reimbursement under the Outpatient Prospective Payment System (OPPS) constitute the potential respondent universe, approximately 3,300 hospitals.


2. Describe procedures for collecting information.


Data are submitted via a secure website (QualityNet). Data may be patient-level data submitted directly to CMS, or aggregate data submitted directly to CMS or to the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) via web-based tools. Electronic data conforming to a specified format are collected in a secure relational database.


3. Describe methods to maximize response rates.


To maximize response rates, the Hospital OQR Program provides a payment incentive for meeting participation requirements. Hospitals that do not meet program requirements may receive a 2.0 percentage point reduction in their OPPS annual payment update. In addition, CMS provides abstraction and submission tools, education, and technical assistance to any hospitals requiring assistance with program requirements.


4. Describe any tests of procedures or methods.


Background History on Validation Policy for Chart-Abstracted Data for the Hospital OQR Program


CMS has requirements, termed "validation requirements," for assessing the accuracy of chart-abstracted patient-level data submitted directly to CMS.1 CMS first finalized data validation requirements for hospitals in the CY 2010 OPPS/ASC final rule with comment period, in which CMS finalized its proposal to select a random sample of 7,300 cases, including up to 20 cases per participating hospital. In the CY 2011 OPPS/ASC final rule with comment period, CMS finalized a proposal to validate data from 800 randomly selected hospitals (approximately 20 percent of all participating Hospital OQR Program hospitals) each year, beginning with the CY 2012 payment determination. Subsequently, in the CY 2012 OPPS/ASC final rule with comment period, CMS finalized a policy to reduce the number of randomly selected hospitals from 800 to 450 and, in addition, finalized a proposal to select up to an additional 50 hospitals based upon targeting criteria. Finally, in the CY 2018 OPPS/ASC final rule with comment period, CMS clarified the hospital selection process previously finalized for validation. Additional details about the history of the Hospital OQR Program validation policies are included below:


  • In the CY 2010 OPPS/ASC final rule with comment period (74 FR 60647 through 60648), CMS finalized a proposal to select a random sample of 7,300 cases from all cases successfully submitted to the OPPS Clinical Warehouse by all participating hospitals from April 1, 2009 through March 31, 2010. The sample also included up to 20 cases per participating hospital. CMS chose a sample size of 7,300 because it believed this sample size would enable detection of a relative difference of 10 percent in the measured overall accuracy rate with a 95 percent (two-tailed) confidence interval and would also provide sufficient data to conduct post hoc stratified analyses that provide meaningful feedback. These figures were based upon a power analysis assuming a population measure mismatch rate of five percent, with the outcome for each abstracted data element being either a match or a mismatch between what the hospital submitted and what was determined by the validation process (an illustrative calculation appears after this list).


  • In the CY 2011 OPPS/ASC final rule with comment period (75 FR 72104 through 72105), CMS finalized a proposal to validate data from 800 randomly selected hospitals (approximately 20 percent of all participating Hospital OQR Program hospitals) each year, beginning with the CY 2012 payment determination. CMS proposed to sample 800 hospitals because, based upon sampling simulation studies using Hospital OQR Program data, sampling this number would provide a representative sample of hospitals across various strata (for example, urban, rural, and bed size) while significantly reducing overall hospital burden. For the CY 2012 payment determination, CMS would select only from hospitals participating for the CY 2012 payment update, so a hospital that submitted data for CY 2011 but subsequently withdrew would not be eligible for selection. CMS noted that because 800 hospitals would be selected randomly, every Hospital OQR Program participating hospital would be eligible each year for validation selection. For each selected hospital, CMS would randomly select up to a total of 48 self-reported cases (12 per quarter) from the cases that the hospital successfully submitted to the OPPS Clinical Warehouse.


  • In the CY 2012 OPPS/ASC final rule with comment period (76 FR 74484 through 74485), CMS finalized a policy to reduce the number of randomly selected hospitals from 800 to 450. Because these 450 hospitals will be selected randomly, every Hospital OQR Program participating hospital will be eligible each year for validation selection. To be eligible for random selection for validation, a hospital must be coded as open in the Certification and Survey Provider Enhanced Reporting (CASPER) system at the time of selection and must have submitted at least 10 encounters to the OPPS Clinical Warehouse during the data collection period for the CY 2013 payment determination. In addition, CMS finalized a proposal to select up to an additional 50 hospitals based upon targeting criteria. A hospital could be selected for validation based on targeting criteria if it:

  • Fails the validation requirement that applies to the CY 2012 payment determination; or

  • Has an outlier value for a measure based on the data it submits.2


  • In the CY 2018 OPPS/ASC final rule with comment period (82 FR 52581), CMS noted that the criteria for targeting 50 outlier hospitals, described above, do not specify whether high- or low-performing hospitals will be targeted, and clarified that hospitals with outlier values indicating specifically poor scores on a measure (for example, a long median time to fibrinolysis) will be targeted for validation. In other words, for the purposes of validation selection under the Hospital OQR Program, an "outlier value" is a measure value that is greater than 5 standard deviations from the mean of the measure values for other hospitals and that indicates a poor score.
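
The arithmetic behind the 7,300-case sample size can be illustrated as follows. If the 10 percent relative precision target is read as applying to the assumed 5 percent mismatch rate, then under the standard normal approximation for a binomial proportion, a sample of roughly 7,300 cases yields a 95 percent (two-tailed) confidence interval half-width of about 0.5 percentage points, which is 10 percent of the assumed rate. The sketch below is illustrative only; it is not CMS's actual power analysis, and the reading of the precision target, the formula, and the variable names are assumptions.

```python
# Illustrative sketch only (not CMS's actual power analysis): checks that a
# sample of about 7,300 cases gives roughly 10 percent relative precision
# (95 percent, two-tailed) around an assumed 5 percent mismatch rate.
import math

p = 0.05          # assumed population measure mismatch rate
n = 7300          # finalized sample size
z = 1.959964      # two-tailed 95 percent critical value of the standard normal

half_width = z * math.sqrt(p * (1 - p) / n)   # normal-approximation CI half-width
relative_precision = half_width / p

print(f"95% CI half-width: +/- {half_width:.4f} ({half_width * 100:.2f} percentage points)")
print(f"Relative precision: {relative_precision:.1%} of the assumed mismatch rate")
# Prints a half-width of about +/- 0.0050, i.e., roughly 10% of the assumed rate.
```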


Current Validation Policy for Hospital OQR Program


CMS selects 500 hospitals for validation; 450 are selected randomly, and the remaining 50 are selected using the targeting criteria stated in the CY 2013 OPPS/ASC final rule with comment period (77 FR 68484 through 68487) and clarified in the CY 2018 OPPS/ASC final rule with comment period (82 FR 52581). To be eligible for random selection for validation, a hospital must be coded as open in the Certification and Survey Provider Enhanced Reporting (CASPER) system at the time of selection and must have submitted at least 12 encounters to the Hospital OQR Program Clinical Warehouse during the quarter containing the most recently available data.3 The quarter containing the most recently available data is defined based on when the random sample is drawn (79 FR 66965). The additional 50 hospitals are selected based on targeting criteria: having failed the validation requirement that applied to the previous year's payment determination, or having an outlier value for a measure based on the criteria finalized in the CY 2012 OPPS/ASC final rule with comment period (76 FR 74485). Hospitals with outlier values indicating specifically poor scores on a measure (for example, a long median time to fibrinolysis) are targeted for validation (82 FR 52581).
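
The selection process described above amounts to a two-stage draw: a random sample of 450 hospitals from the pool of eligible hospitals, followed by up to 50 additional hospitals that meet the targeting criteria. The sketch below is a simplified, hypothetical illustration of that logic; the data structure, field names, and use of simple random sampling for the targeted pool are assumptions and do not represent CMS's production selection process.

```python
# Simplified, hypothetical sketch of the validation selection logic described
# above; field names and data structures are illustrative assumptions.
import random
from dataclasses import dataclass

@dataclass
class Hospital:
    ccn: str                        # CMS Certification Number
    open_in_casper: bool            # coded as open in CASPER at the time of selection
    recent_quarter_encounters: int  # encounters submitted for the most recent quarter
    failed_prior_validation: bool   # failed the prior year's validation requirement
    poor_outlier_value: bool        # measure value > 5 SD from the mean, in the poor direction

def select_for_validation(hospitals, n_random=450, n_targeted=50, seed=None):
    """Return (randomly selected hospitals, targeted hospitals)."""
    rng = random.Random(seed)

    # Stage 1: random sample from hospitals meeting the eligibility criteria.
    eligible = [h for h in hospitals
                if h.open_in_casper and h.recent_quarter_encounters >= 12]
    random_sample = rng.sample(eligible, min(n_random, len(eligible)))

    # Stage 2: up to 50 additional hospitals meeting either targeting criterion.
    already_chosen = {h.ccn for h in random_sample}
    targeted_pool = [h for h in hospitals
                     if h.ccn not in already_chosen
                     and (h.failed_prior_validation or h.poor_outlier_value)]
    # If more than n_targeted hospitals qualify, draw n_targeted of them at random.
    targeted_sample = rng.sample(targeted_pool, min(n_targeted, len(targeted_pool)))

    return random_sample, targeted_sample
```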


As stated in the CY 2013 OPPS/ASC final rule with comment period (77 FR 68486 through 68487), after the random selection has been completed, the CMS-designated contractor, known as the CMS Clinical Data Abstraction Center (CDAC), sends record requests by a trackable mail method to the designated Medical Record Contact at the hospital. Each hospital must then submit the requested documentation to the CDAC within 45 calendar days of the date of the request (as documented on the request letter). If the hospital fails to comply within 45 days, a "zero" score is assigned to each data element for each selected case, and the case will fail for all measures in the same topic (for example, OP-18, OP-20, and OP-22 for ED-Throughput).


Validation Response Rates for the Hospital OQR Program


CMS has consistently achieved high response rates from hospitals selected for validation in the Hospital OQR Program. The response rates for the four most recent quarterly samples are:


Q4 2017 (October 1 – December 31) – 99.9%

Q1 2018 (January 1 – March 31) – 100%

Q2 2018 (April 1 – June 30) – 100%

Q3 2018 (July 1 – September 30) – 100%


To ensure consistently high response rates from selected hospitals for validation, the CDAC provides a 30-day reminder notice to hospitals that have outstanding medical records.


Once the CDAC receives the requested medical documentation, it independently re-abstracts the same quality measure data elements that the hospital previously abstracted and submitted, and it compares the two sets of data to determine whether they match. A confidence interval using a binomial approach is applied in the calculation of validation scores to account for sample variability, for the binary nature of the data being analyzed (match or mismatch), and for the possibility of small sample sizes. To receive a full annual payment update, hospitals must obtain at least a 75 percent validation score for the designated time period based upon this validation process (77 FR 68487).
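
As an illustration of the binomial approach, a validation score can be viewed as the proportion of re-abstracted data elements that match the hospital's submission, with a binomial confidence interval placed around that proportion to reflect sampling variability. The sketch below is a hedged example only: the use of an exact (Clopper-Pearson) interval, the example counts, and the manner in which the interval would be compared with the 75 percent threshold are assumptions, as those details are not specified in this statement.

```python
# Hedged illustration of a binomial confidence interval around a validation
# score (matching elements / total elements). The exact (Clopper-Pearson)
# interval and the example counts are assumptions for demonstration only.
from scipy.stats import beta

def validation_score_interval(matches, total_elements, confidence=0.95):
    """Return (point estimate, lower bound, upper bound) for the match rate."""
    alpha = 1.0 - confidence
    point = matches / total_elements
    lower = 0.0 if matches == 0 else beta.ppf(alpha / 2, matches, total_elements - matches + 1)
    upper = 1.0 if matches == total_elements else beta.ppf(1 - alpha / 2, matches + 1, total_elements - matches)
    return point, lower, upper

# Hypothetical example: 70 of 90 re-abstracted data elements match.
point, low, high = validation_score_interval(70, 90)
print(f"Validation score {point:.1%}, 95% CI ({low:.1%}, {high:.1%})")
```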


CMS uses these validation efforts to assess the accuracy of chart-abstracted patient-level data submitted by hospitals to the Hospital OQR Program. Hospital OQR Program data for selected time periods become public as required by section 1833(t)(17)(E) of the Social Security Act and are posted by the corresponding hospital CMS Certification Number (CCN) on the Hospital Compare website.4 Data are publicly reported on Hospital Compare to help consumers make better-informed decisions and to assist hospitals in their quality improvement initiatives by providing them an opportunity to view how they are performing in comparison to other hospitals. CMS makes chart-abstracted patient-level data submitted by hospitals to the Hospital OQR Program publicly available on the Hospital Compare website whether or not the data have been validated for payment purposes.


5. Provide name and telephone number of individuals consulted on statistical aspects.


Anita Bhatia, 410-786-7236
James Poyer, 410-786-2261


1 Please see the CY 2013 OPPS/ASC final rule with comment period (77 FR 68484 through 68487), the CY 2015 OPPS/ASC final rule with comment period (79 FR 66964 through 66965), and the CY 2018 OPPS/ASC final rule with comment period (82 FR 52581) for an extensive discussion of finalized policies regarding our validation requirements. We codified these policies at 42 CFR 419.46(e).

2 In the CY 2012 OPPS/ASC final rule with comment period (76 FR 74485), CMS defined an "outlier value" for purposes of this targeting as a measure value that appears to deviate markedly from the measure values for other hospitals. For a normally distributed variable, nearly all values of the variable lie within 3 standard deviations of the mean; very few values lie past the 3 standard deviation mark. One definition of an outlier is a value that exceeds this threshold. In order to target very extreme values, CMS finalized a policy of targeting hospitals that greatly exceed this threshold because such extreme values strongly suggest that the submitted data are inaccurate. Specifically, CMS finalized a policy to select hospitals for validation if their measure value for a measure is greater than 5 standard deviations from the mean, placing the expected occurrence of such a value outside of this range at 1 in 1,744,278. If more than 50 hospitals meet either of the above targeting criteria, then up to 50 would be selected randomly from this pool of hospitals.
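
The 1-in-1,744,278 figure can be reproduced with a short calculation of the two-tailed probability that a normally distributed value falls more than 5 standard deviations from its mean. The sketch below assumes a standard normal distribution and uses SciPy only to evaluate the tail probability.

```python
# Reproduces the figure cited above: for a normally distributed measure value,
# the probability of falling more than 5 standard deviations from the mean
# (in either direction) is about 1 in 1,744,278.
from scipy.stats import norm

two_tailed_prob = 2 * norm.sf(5)   # P(|Z| > 5) for a standard normal variable
print(f"P(|Z| > 5) = {two_tailed_prob:.3e}")        # about 5.733e-07
print(f"Roughly 1 in {1 / two_tailed_prob:,.0f}")   # about 1 in 1,744,278
```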

3 As stated in the CY 2015 OPPS/ASC final rule with comment period (79 FR 66965), this eligibility criterion applies beginning with the CY 2015 encounter period for the CY 2017 payment determination and for subsequent years.

4 Quality measure data that do not reach a certain case minimum are not reported on Hospital Compare. For hospitals that treat a low number of patients but otherwise meet the submission requirements for a particular quality measure, CMS finalized a policy in the CY 2012 OPPS/ASC final rule with comment period (76 FR 74482) that hospitals with five or fewer encounters (both Medicare and non-Medicare) for any measure included in a measure topic in a quarter are not required to submit patient-level data for the entire measure topic for that quarter, although they may do so voluntarily. Please see the CY 2011 OPPS/ASC final rule with comment period (75 FR 72100 through 72103) and the CY 2012 OPPS/ASC final rule with comment period (76 FR 74482 through 74483) for further discussion of our policy that hospitals may voluntarily submit aggregate population and sample size counts for Medicare and non-Medicare encounters for the measure populations for which chart-abstracted data must be submitted.
