Appendix E1 Measures under Consideration 2023 Data Template for Candidate Measures

Quality Payment Program (QPP)/Merit-Based Incentive Payment System (MIPS) (CMS-10621)

Appendix E1 2023 MUC Data Template

CY 2024 Performance Period/2026 MIPS Payment Year Burden Summary

OMB: 0938-1314


Centers for Medicare & Medicaid Services

Measures Under Consideration Entry/Review and Information Tool 2023 Data Template for Candidate Measures


Instructions:

  1. Before accessing the CMS MERIT (Measures Under Consideration Entry/Review and Information Tool) online system, you are invited to complete the measure template below by entering your candidate measure information in the column titled “Add Your Content Here.”

  2. All rows with an asterisk (*) in the Field Label require a response unless otherwise indicated in the template.

  3. For each row, the “Guidance” column provides details on how to complete the template and what kinds of data to include. Unless otherwise specified, the character limit for text fields in CMS MERIT is 8,000 characters.

  4. For check boxes, note whether the field is “select one” or “select all that apply.” You can click on the box to place or remove the “X.”

  5. Numeric fields are noted, where applicable, in the “Add Your Content Here” column.

  6. Row numbers are for convenience only and do not appear on the CMS MERIT user interface.

  7. Send any questions to [email protected].


PROPERTIES


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Information

001

*Measure Title

Provide the measure title only (255 characters or less). Put any program-specific identification (ID) number under Characteristics, not in the title. Note: Do not enter the CMIT ID, consensus-based entity (endorsement) ID, former Jira MUC ID number, or any other ID numbers here (see other fields below). The CMS program name should not ordinarily be part of the measure title, because each measure record already has a required field that specifies the CMS program. An exception would be if there are several measures with otherwise identical titles that apply to different programs. In this case, including or embedding a program name in the title (to prevent duplicate titles) is helpful. For additional information on measure title, see: https://mmshub.cms.gov/measure-lifecycle/measure-specification/document-measure.


ADD YOUR CONTENT HERE

Measure Information

002

*Measure description

Provide a brief description of the measure. For additional information on measure description, see: https://mmshub.cms.gov/measure-lifecycle/measure-specification/document-measure.

ADD YOUR CONTENT HERE

Measure Information

003

*Select the CMS program(s) for which the measure is being submitted.

Select all that apply. Please note, measures specified and intended for use at more than one level of analysis must be submitted separately for each level of analysis (e.g., individual clinician, facility). If you choose multiple programs for this submission, please ensure the programs fall under the same level of analysis. If you choose multiple programs and need guidance as to whether your selection represents multiple levels of analysis, please contact [email protected]. CMS MERIT includes functionality to reduce data entry for multiple submissions of the same measure; please reach out to [email protected] for guidance and support.


If you are submitting for MIPS, there are two choices of program. Do NOT enter both MIPS-Quality and MIPS-Cost for the same measure. Choose MIPS-Quality for measures that pertain to quality and/or efficiency. Choose MIPS-Cost only for measures that pertain to cost.



Ambulatory Surgical Center Quality Reporting Program

End-Stage Renal Disease (ESRD) Quality Incentive Program

Home Health Quality Reporting Program

Hospice Quality Reporting Program

Hospital Inpatient Quality Reporting Program

Hospital Outpatient Quality Reporting Program

Hospital Readmissions Reduction Program

Hospital Value-Based Purchasing Program

Hospital-Acquired Condition Reduction Program

Inpatient Psychiatric Facility Quality Reporting Program

Inpatient Rehabilitation Facility Quality Reporting Program

Long-Term Care (LTC) Hospital Quality Reporting Program

Medicare Promoting Interoperability Program for Eligible Hospitals and Critical Access Hospitals (CAHs)

Medicare Shared Savings Program

Merit-based Incentive Payment System-Cost

Merit-based Incentive Payment System-Quality

Part C & D Star Ratings [Medicare]

Prospective Payment System-Exempt Cancer Hospital Quality Reporting Program

Rural Emergency Hospital Quality Reporting Program

Skilled Nursing Facility Quality Reporting Program

Skilled Nursing Facility Value-Based Purchasing Program

n/a

n/a

If you select “Merit-based Incentive Payment System-Quality” in Row 003, then Row 004 becomes an optional field.

n/a

This is not a data entry field.

Measure Information

004

MIPS Quality: Identify any links with related Cost measures and Improvement Activities

Where available, provide description of linkages and a rationale that correlates this MIPS quality measure to other performance category measures and activities.

ADD YOUR CONTENT HERE

Measure Information

005

*Stage of Development

Select the measure’s current stage of development. A fully developed measure is a measure that has completed beta testing. Note that fully developed measures are highly preferred.


For additional information regarding stage of development, see: https://mmshub.cms.gov/blueprint-measure-lifecycle-overview.


Conceptualization

Specification

Field (Beta) Testing

Fully Developed

n/a

n/a

If you select “Conceptualization,” “Specification,” or “Field (Beta) Testing” in Row 005, then Row 006 becomes a required field. If you select “Fully Developed” in Row 005, then skip to Row 007.

n/a

This is not a data entry field.

Measure Information

006

* Stage of Development Details

If “Conceptualization,” “Specification,” or “Field (Beta) Testing,” describe when testing is planned (i.e., specific dates), what type of testing is planned (e.g., alpha, beta) as well as the types of facilities in which the measure will be tested.


For additional information, see: https://mmshub.cms.gov/blueprint-measure-lifecycle-overview.


ADD YOUR CONTENT HERE

Measure Information

007

*Level of Analysis

Select one. Select the level of analysis at which the measure is specified and intended for use. If the measure is specified and intended for use at more than one level, submit the other levels separately. Any testing results provided in subsequent sections of this submission must be conducted at the level of analysis selected here.


For submission to the MIPS-Quality program, you must report, at minimum, the results of individual clinician-level testing. If testing is performed at both clinician-individual and clinician-group levels of analysis, you may select “Clinician: Individual and Group (MIPS-Quality only).” Please submit results of individual clinician-level testing in this form and group-level testing results in an attachment.

For submission to the MIPS-Cost program, clinician group-level testing is sufficient.

Clinician: Individual only

Clinician: Group

Facility

Clinician: Individual and Group (MIPS-Quality only)

Health plan

Population: Regional and State

Accountable Care Organization

Integrated Delivery System

Medicaid program (e.g., Health Home or 1115)

Population: Community, County or City


Measure Information

008

*In which setting(s) was this measure tested?

Select all that apply.

Ambulatory surgery center

Ambulatory/office-based care

Behavioral health clinic

Inpatient psychiatric facility

Community hospital

Dialysis facility

Emergency department

Federally qualified health center (FQHC)

Health and drug plans

Hospital outpatient department (HOD)

Home health

Hospice

Hospital inpatient acute care facility

Inpatient rehabilitation facility

Long-term care hospital

Nursing home

PPS-exempt cancer hospital

Skilled nursing facility

Veterans Health Administration facility

Not yet tested

Other (enter here):

Measure Information

009

*Multiple Scores

Does the submitter recommend that more than one measure score be reported for this measure (e.g., 7- and 30-day rates, rates for different procedure types)? Note: If “Yes,” please describe one score only in this form. Submit separate attachments for each of the other scores.


Yes

No

n/a

n/a

If you select “Yes” in Row 009, then Rows 010-012 become required fields. If you select “No,” then skip to Row 013.

n/a

This is not a data entry field.

Measure Information

010

*Measures with Multiple Scores: Number of Scores

How many measure scores are recommended for this measure?

Numeric field

Measure Information

011

*Measures with Multiple Scores: Name of Score Reported in MERIT Form

Please enter the name of the score described in this MERIT form.

Free text field

Measure Information

012

*Measures with Multiple Scores: Names of Scores

Please enter the names of all additional scores included in this measure but not described in this MERIT form. Please enter the names separated by a semicolon and do not enter any additional information in this field.

Free text field

Measure Information

013

*Is the measure a composite?

Select one. A composite measure contains two or more individual measures, resulting in a single measure and a single score. If this measure is a composite measure, please enter data relevant to the overall composite into this form. Please attach any additional information pertaining to individual components.

Yes

No

Measure Information

014

*Is this a paired measure?

Select one. Paired measures have different measure scores, but their results must be reported together to be interpreted appropriately.


Note: Individual measures comprising a paired measure must be submitted individually.

Yes

No

n/a

n/a

If you select “Yes” in Row 014, then Rows 015-016 become required fields. If you select “No” in this field, then skip to Row 017.

n/a

This is not a data entry field.

Measure Information

015

*How many measures are intended to be paired with this measure?

How many other measures are intended to be paired with this measure? Do not include this measure in the count.

Numeric field

Measure Information

016

*What are the titles of all measures that should be paired with this measure?

Please enter the measure titles for all other measures that should be paired with this measure. Do not include this measure in the list. Please enter the measure titles separated by a semicolon, and do not enter any additional information in this field.

Free text field

Measure Information

017

*Numerator

The upper portion of a fraction used to calculate a rate, proportion, or ratio. An action to be counted as meeting a measure's requirements. For all fields, especially Numerator and Denominator, use plain text whenever possible. If needed, convert any special symbols, math expressions, or equations to plain text (keyboard alphanumeric, such as + - * /). This will help reduce errors and speed up data conversion, team evaluation, and MUC report formatting.


For all free-text fields: Be sure to spell out all abbreviations and define special terms at their first occurrence. This will save time and revision/editing cycles during clearance.

ADD YOUR CONTENT HERE

Measure Information

018

*Numerator Exclusions

For additional information on exclusions/exceptions, see: https://mmshub.cms.gov/measure-lifecycle/measure-testing/evaluation-criteria/scientific-acceptability/exclusions. If not applicable, enter 'N/A.'

ADD YOUR CONTENT HERE

Measure Information

019

*Denominator

The lower part of a fraction used to calculate a rate, proportion, or ratio. The denominator is associated with a given population that may be counted as eligible to meet a measure’s inclusion requirements.

ADD YOUR CONTENT HERE

Measure Information

020

*Denominator Exclusions

For additional information on exclusions/exceptions, see: https://mmshub.cms.gov/measure-lifecycle/measure-testing/evaluation-criteria/scientific-acceptability/exclusions. If not applicable, enter 'N/A.'

ADD YOUR CONTENT HERE

Measure Information

021

*Denominator Exceptions

For additional information on exclusions/exceptions, see: https://mmshub.cms.gov/measure-lifecycle/measure-testing/evaluation-criteria/scientific-acceptability/exclusions. If not applicable, enter ‘N/A.’

ADD YOUR CONTENT HERE

Measure Information

022

*Briefly describe the rationale for the measure

Briefly describe the rationale for the measure and/or the impact the measure is anticipated to achieve. Details about the evidence to support the measure will be captured in the Evidence section.

ADD YOUR CONTENT HERE


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Implementation

023

*Feasibility of Data Elements

Select one. Select the extent to which the specified data elements are available in electronic fields. Electronic fields should include a designated location and format for the data in claims, EHRs, registries, etc.


  • Select “ALL data elements are in defined fields in electronic sources” if the data elements needed to calculate the measure are all available in discrete and electronically defined fields.

  • Select “Some data elements are in defined fields in electronic sources” if the data elements needed to calculate the measure are not all available in discrete and electronically defined fields.

  • Select “No data elements are in defined fields in electronic sources” if none of the data elements needed to calculate the measure are available in discrete and electronically defined fields.

  • Select “Not applicable” ONLY for measures that are not fully developed OR for CAHPS measures.


For a patient-reported outcome-based performance measure (PRO-PM), select the most appropriate option based on the data collection format(s).

ALL data elements are in defined fields in electronic sources

Some data elements are in defined fields in electronic sources

No data elements are in defined fields in electronic sources

Not applicable

Measure Implementation

024

*Method of measure calculation

Select one. Select the method used to calculate measure scores for the version of the measure proposed in this submission form. Please review guidance before making selections:

  • Select “Electronically Derived Administrative Claims” if the measure can be calculated exclusively from claims data submitted electronically for billing or other purposes.

  • Select “eCQM” if the measure is exclusively specified and formatted to use data from electronic health records (EHRs) and/or health information technology systems, using the Quality Data Model (QDM) to define the data elements and Clinical Quality Language (CQL) to express measure logic.

  • Select “Other digital method” if the measure does not meet the definition of an eCQM as described above, but can be calculated electronically (e.g., registry, MDS, OASIS).

  • Select “Manual abstraction” if all data elements in the measure require manual review of records, paper-based billing, or manual calculation (e.g., CAHPS).

  • Select “Combination” if two or more types of data sources are required to calculate the measure score.

  • For all other measures that rely on patient surveys (e.g., PRO-PMs), select the option that best describes the way the measure is calculated. For example, if a patient survey is collected electronically and does not require manual abstraction, select "Other digital method" or "eCQM" depending on where the data are collected.

Electronically Derived Administrative Claims

eCQM

Other digital method

Manual abstraction

Combination


Measure Implementation

n/a

If you select "Combination" in Row 024, then Row 025 becomes a required field.

n/a

This is not a data entry field.

Measure Implementation

025

*Combination measure: Methods of calculation

Select all that apply. A minimum of two options must be selected.

Electronically Derived Administrative Claims

eCQM

Other digital method

Manual abstraction

Measure Implementation

026

*How is the measure expected to be reported to the program?

This is the anticipated data submission method. Select all that apply. Use the “Submitter Comments” field to specify or elaborate on the type of reporting data, if needed to define your measure.

eCQM

Clinical Quality Measure (CQM) Registry

Claims

Web interface

Other (enter here):



Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Burden

027

*Burden for Provider: Was a provider workflow analysis conducted?

Select one. Select "Not applicable" if the measure imposes no burden on the provider (e.g., CAHPS measures or measures based on administrative data (non-claims), claims data).).

Yes

No

Not applicable



n/a

n/a

If you select “Yes” in Row 027, then Rows 028 and 029 become required fields. If you select “No” in Row 027, then skip to Row 030.

n/a

This is not a data entry field.

Burden

028

*If yes, how many sites were evaluated in the provider workflow analysis?

Enter the number of sites that were evaluated in the provider workflow analysis.

Select "Not applicable" if the measure does not impose any burden on providers (e.g., CAHPS measures or measures based on administrative data (non-claims) or claims data).

Numeric field


Burden

029

*Does the provider workflow have to be modified to collect additional data needed to report the measure?

Select one.

If workflow modifications required moderate to significant additional data entry from a clinician or other provider to collect the data elements to report the measure because data are not routinely collected during clinical care or EHR interface changes were necessary, select “Yes.”

If workflow modifications required no, or limited, additional data entry from a clinician or other provider to collect the data elements to report the measure because data are routinely collected during the clinical care and no EHR interface changes were necessary, select “No.”

Yes

No



Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Score Level (Accountable Entity Level) Testing

030

*Reliability

Indicate whether reliability testing was conducted for the accountable entity-level measure scores. Acceptable reliability tests include signal-to-noise (or inter-unit reliability) or random split-half correlation. For more information on accountable entity-level reliability testing, refer to the CMS Measures Management System Blueprint (https://mmshub.cms.gov/measure-lifecycle/measure-testing/evaluation-criteria/scientific-acceptability/reliability). Select “Yes” if acceptable accountable entity-level reliability testing has been completed as of submission of this form.

Select “No” if you are not able to provide the results of acceptable accountable entity-level reliability testing in this submission. If testing results are incomplete, or if you are submitting a different type of reliability testing, provide as an attachment.

Note: This section refers to the reliability of the accountable entity-level measure scores in the final performance measure. For testing of surveys or patient reported tools, refer to the Patient-Reported Data section. Note: for MIPS-Quality submissions, please provide individual clinician-level results. If the measure was also tested at the clinician group level, you may include those results in an attachment.

Yes

No

Measure Score Level (Accountable Entity Level) Testing

031

*Reliability: Type of analysis

Select all that apply.

Signal-to-noise (or inter-unit reliability) is the precision attributed to an actual construct versus random variation (e.g., ratio of between unit variance to total variance) (Adams J. The reliability of provider profiling: a tutorial. Santa Monica, CA: RAND; 2009. http://www.rand.org/pubs/technical_reports/TR653.html).

Random split-half correlation is the agreement between two measures of the same concept, using data derived from split samples drawn from the same entity at a single point in time.

Signal-to-Noise

Random Split-Half Correlation


n/a

n/a

If you select “Signal-to-Noise” in Row 031, then Rows 032-035 become required fields. If you select “Random Split-Half Correlation” in Row 031, then Rows 036-039 become required fields.

n/a

This is not a data entry field.

Measure Score Level (Accountable Entity Level) Testing

032

*Signal-to-Noise: Level of Analysis

Select the level of analysis at which the signal-to-noise analysis was conducted. If the measure is specified and intended for use at more than one level, ensure the results in this section are at the same level of analysis selected in the Measure Information section of this form.

For MIPS-Quality submissions, you must report the results of individual clinician-level testing. If group-level testing is available, you may submit those results as an attachment.

Accountable Care Organization

Clinician – Individual only

Clinician – Group only

Facility

Health plan

Integrated Delivery System

Population: Community, County or City

Population: Regional and State


Measure Score Level (Accountable Entity Level) Testing

033

*Signal-to-Noise: Sample size

Indicate the number of accountable entities sampled to test the final performance measure. Note that this field is intended to capture the number of measured entities and not the number of individual patients or cases included in the sample.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

034

*Signal-to-Noise: Median Statistical result

Indicate the median result for the signal-to-noise analysis used to assess accountable entity level reliability. Results should range from 0.00 to 1.00. Calculate reliability as the measure is intended to be implemented (e.g., after applying minimum denominator requirements, appropriate type of setting, provider, etc.).

Numeric field

Measure Score Level (Accountable Entity Level) Testing

035

*Signal-to-Noise: Interpretation of results

Describe the type of statistic and interpretation of the results (e.g., low, moderate, high). Provide the distribution of signal-to-noise results across measured entities (e.g., min, max, percentiles). List accepted thresholds referenced and provide a citation. If applicable, include the precision of the statistical result (e.g., 95% confidence interval) and/or an assessment of statistical significance (e.g., p-value).

ADD YOUR CONTENT HERE
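Illustrative sketch for the signal-to-noise fields above (Rows 032-035): a minimal Python example, using hypothetical entity-level numerators and denominators and a simplified variance-decomposition estimate (production analyses often use hierarchical or beta-binomial models, per Adams 2009). It is not part of the form; it only shows how a median signal-to-noise result in the 0.00-1.00 range might be produced.

```python
import numpy as np

def signal_to_noise(numerators, denominators):
    """Per-entity reliability for a proportion measure: signal / (signal + noise)."""
    num = np.asarray(numerators, dtype=float)
    den = np.asarray(denominators, dtype=float)
    rates = num / den
    var_between = np.var(rates, ddof=1)              # "signal": variance across entities (simplified estimate)
    var_within = rates * (1 - rates) / den           # "noise": per-entity sampling variance
    return var_between / (var_between + var_within)  # per-entity reliability, 0.00 to 1.00

# Hypothetical entities, after applying minimum denominator requirements.
rel = signal_to_noise([40, 55, 20, 70, 33], [100, 120, 60, 150, 90])
print("median:", round(float(np.median(rel)), 3),
      "min:", round(float(rel.min()), 3),
      "max:", round(float(rel.max()), 3))
```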


Measure Score Level (Accountable Entity Level) Testing

036

*Random Split-Half Correlation: Level of Analysis

Select the level of analysis at which the random split-half analysis was conducted. If the measure is specified and intended for use at more than one level, ensure the results in this section are at the same level of analysis selected in the Measure Information section of this form.

For MIPS-Quality submissions, you must report the results of individual clinician-level testing. If group-level testing is available, you may submit those results as an attachment.

Accountable Care Organization

Clinician – Individual only

Clinician – Group only

Facility

Health plan

Integrated Delivery System

Population: Community, County or City

Population: Regional and State



Measure Score Level (Accountable Entity Level) Testing

037

*Random Split-Half Correlation: Sample size

Indicate the number of accountable entities sampled to test the final performance measure. If number varied by sample, use the largest number of measured entities. Note that this field is intended to capture the number of measured entities and not the number of individual patients or cases included in the sample.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

038

*Random Split-Half Correlation: Statistical result

Indicate the statistical result for the random split-half correlation analysis used to assess accountable entity level reliability. Results should range from -1.00 to 1.00. Calculate reliability as the measure is intended to be implemented (e.g., after applying minimum denominator requirements, appropriate type of setting, provider, etc.).

Numeric field

Measure Score Level (Accountable Entity Level) Testing

039

*Random Split-Half Correlation: Interpretation of results

Describe the type of statistic and interpretation of the results (e.g., low, moderate, high). List accepted thresholds referenced and provide a citation. If applicable, include the precision of the statistical result (e.g., 95% confidence interval) and/or an assessment of statistical significance (e.g., p-value).

ADD YOUR CONTENT HERE
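Illustrative sketch for the random split-half fields above (Rows 036-039): a minimal Python example, using hypothetical patient-level outcomes grouped by accountable entity, that splits each entity's cases at random, scores each half, and correlates the two sets of scores (Pearson r, between -1.00 and 1.00). It is not part of the form.

```python
import numpy as np

def split_half_correlation(entity_outcomes, seed=0):
    """Pearson correlation between entity scores computed on two random half-samples."""
    rng = np.random.default_rng(seed)
    half_a, half_b = [], []
    for outcomes in entity_outcomes.values():
        shuffled = rng.permutation(outcomes)         # random split within each entity
        mid = len(shuffled) // 2
        half_a.append(shuffled[:mid].mean())         # entity score from first half
        half_b.append(shuffled[mid:].mean())         # entity score from second half
    return float(np.corrcoef(half_a, half_b)[0, 1])  # between -1.00 and 1.00

# Hypothetical pass/fail outcomes (1 = case meets the numerator) for three entities.
data = {"Entity A": [1, 0, 1, 1, 0, 1], "Entity B": [0, 0, 1, 0, 1, 0], "Entity C": [1, 1, 1, 0, 1, 1]}
print(round(split_half_correlation(data), 3))
```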


Measure Score Level (Accountable Entity Level) Testing

040

*Empiric Validity

Indicate whether empiric validity testing was conducted for the accountable entity-level measure scores. For more information on accountable entity-level empiric validity testing, refer to the CMS Measures Management System Blueprint (https://mmshub.cms.gov/measure-lifecycle/measure-testing/evaluation-criteria/scientific-acceptability/validity).

Note: This section refers to the empiric validity of the accountable entity level measure scores in the final performance measure. Refer to the Patient-Reported Data section for testing of surveys or patient reported tools.

Note: for MIPS-Quality submissions, please provide individual clinician-level results. If the measure was also tested at the clinician group level, you may include those results in an attachment.

Yes

No

n/a

n/a

If you select “Yes” in Row 040, then Rows 041-046 become required fields. If you select “No” in Row 040, then skip to Row 047.

n/a

This is not a data entry field.

Measure Score Level (Accountable Entity Level) Testing

041

*Empiric Validity: Statistic name

Indicate the name for the statistic used to assess accountable entity level validity. Describe whether the result is a relative risk, odds ratio, relative difference in scores, etc.

If more than one test or comparison was conducted, describe the statistic that most strongly supported the validity of the measure and provide the full testing results under the “Methods and findings” question or as an attachment.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

042

*Empiric Validity: Level of Analysis

Select the level of analysis at which the empiric validity analysis was conducted. If the measure is specified and intended for use at more than one level, ensure the results in this section are at the same level of analysis selected in the Measure Information section of this form.

For MIPS-Quality submissions, you must report the results of individual clinician-level testing. If group-level testing is available, you may submit those results as an attachment.

Accountable Care Organization

Clinician – Individual only

Clinician – Group only

Facility

Health plan

Integrated Delivery System

Population: Community, County or City

Population: Regional and State


Measure Score Level (Accountable Entity Level) Testing

043

*Empiric Validity: Sample size

Indicate the number of accountable entities sampled to test the final performance measure. Note that this field is intended to capture the number of measured entities and not the number of individual patients or cases included in the sample.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

044

*Empiric Validity: Statistical result

Indicate the statistical result. Calculate empiric validity as the measure is intended to be implemented (e.g., after applying minimum denominator requirements, etc.).

If more than one test or comparison was conducted, provide the result that most strongly supports the validity of the measure and provide the full testing results under the “Methods and findings” question or as an attachment.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

045

*Empiric Validity: Methods and findings

Describe the methods used to assess accountable entity level validity. Describe the comparison groups or constructs used to verify the validity of the measure scores, including hypothesized relationships (e.g., expected to be positively or negatively correlated). Describe your findings for each analysis conducted, including the statistical result provided above and the strongest and weakest results across analyses. If applicable, include the precision of the statistical result(s) (e.g., 95% confidence interval) and/or an assessment of statistical significance (e.g., p-value). If methods and results require more space, include as an attachment.

ADD YOUR CONTENT HERE


Measure Score Level (Accountable Entity Level) Testing

046

*Empiric Validity: Interpretation of results

Indicate whether the statistical result affirmed the hypothesized relationship for the analysis conducted.

Yes

No

Measure Score Level (Accountable Entity Level) Testing

047

*Face validity

Indicate if a vote was conducted among experts and patients/caregivers on whether the final performance measure scores can be used to differentiate good from poor quality of care.

Select “No” if experts and patients/caregivers did not provide feedback on the final performance measure at the specified level of analysis or if the feedback was related to a property of the measure unrelated to its ability to differentiate performance among measured entities.

This item is intended to assess whether face validity testing was conducted on the final performance measure (vs. on the survey). Survey item testing results can be provided in an attachment and described in the Patient-Reported Data Section.

Yes

No

n/a

n/a

If you select “Yes” in Row 047, then Rows 048-051 become required fields. If you select “No” in Row 047, then skip to Row 052.

n/a

This is not a data entry field.

Measure Score Level (Accountable Entity Level) Testing

048

*Face Validity: Level of Analysis

Select the level of analysis for which experts voted on face validity. If the measure is specified and intended for use at more than one level, ensure the results in this section are at the same level of analysis selected in the Measure Information section of this form.

For MIPS-Quality submissions, you must report the results of individual clinician-level testing. If group-level testing is available, you may submit those results as an attachment.

Accountable Care Organization

Clinician – Individual only

Clinician – Group only

Facility

Health plan

Integrated Delivery System

Population: Community, County or City

Population: Regional and State


Measure Score Level (Accountable Entity Level) Testing

049

*Face validity: Number of voting experts and patients/caregivers

Indicate the number of experts and patients/caregivers who voted on face validity (specifically, whether the measure could differentiate good from poor quality care among accountable entities).

Numeric field

Measure Score Level (Accountable Entity Level) Testing

050

*Face validity: Result

Indicate the number of experts and patients/caregivers who voted in agreement that the measure could differentiate good from poor quality care among accountable entities. If votes were conducted using a scale, sum all responses in agreement with the statement. Do not include neutral votes. If more than one question was asked of the experts and patients/caregivers, only provide results from the question relating to the ability of the final performance measure to differentiate good from poor quality care.

Numeric field

Measure Score Level (Accountable Entity Level) Testing

051

Face validity: Interpretation

Briefly explain the interpretation of the result, including any disagreement with the face validity of the performance measure.

Free text field


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Patient/Encounter Level (Data Element Level) Testing

052

*Patient/Encounter Level Testing

Indicate whether patient/encounter level testing of the individual data elements in the final performance measure was conducted (i.e., measure of agreement such as kappa or correlation coefficient). Prior studies of the same data elements may be submitted.


  • Select “Yes” if data element agreement was assessed at the individual data element level or denominator and numerator level as of submission of this form.

  • Select “No” if you are not able to provide the results of data element agreement in this submission. If you are submitting preliminary testing results or a different type of data element testing, provide as an attachment.

  • Select “No” and skip to the Patient-Reported Data section if data element testing was only conducted for a survey or patient reported tool (e.g., internal consistency) rather than data element agreement for the final performance measure.


Note: This section includes tests of both data element reliability and validity.

Yes

No

Not applicable

n/a

n/a

If you select “Yes” in Row 052, then Rows 053-059 become required fields. If you select “No” in Row 052, then skip to Row 060.

n/a

This is not a data entry field.

Patient/Encounter Level (Data Element Level) Testing

053

*Type of Analysis

Select all that apply. For more information on patient/encounter level testing, refer to the CMS Measures Management System Blueprint (https://mmshub.cms.gov/measure-lifecycle/measure-testing/evaluation-criteria/scientific-acceptability/reliability).


Note: This section refers to the patient/encounter level data elements in the final performance measure. Refer to the Patient-Reported Data section for testing of patient/encounter level data elements in surveys or patient reported tools.

Agreement between two manual reviewers

Agreement between eCQM and manual reviewer

Agreement between other gold standard and manual reviewer



Patient/Encounter Level (Data Element Level) Testing

054

*Sample Size

Indicate the number of patients/encounters sampled.

Numeric field

Patient/Encounter Level (Data Element Level) Testing

055

*Statistic Name

Indicate the statistic used to assess agreement (e.g., percent agreement, kappa, positive predictive value, etc.). If more than one type of statistic was calculated, list the one that best depicts the reliability and/or validity of the data elements in your measure.

Percent agreement

Kappa

Correlation coefficient

Sensitivity

Positive Predictive Value


Patient/Encounter Level (Data Element Level) Testing

056

*Statistical Results: Individual Data Element

Indicate the single lowest critical data element result of the statistic selected above. This field is intended to capture the least reliable or valid data element included in the measure. Information about all critical data elements should be provided in the “Interpretation of results” field.


If providing Kappa or a correlation coefficient, results should be between -1 and 1. If providing percent agreement, sensitivity, or positive predictive value, results should be between 0% and 100%.


If not tested at the individual data element level, enter 9999.

Numeric field
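Illustrative sketch for Rows 053-056: a minimal Python example, using hypothetical yes/no abstractions from two manual reviewers, that computes percent agreement and Cohen's kappa per data element and identifies the single lowest critical data element result. It is not part of the form.

```python
import numpy as np

def percent_agreement(r1, r2):
    r1, r2 = np.asarray(r1), np.asarray(r2)
    return float((r1 == r2).mean() * 100)             # 0% to 100%

def cohens_kappa(r1, r2):
    r1, r2 = np.asarray(r1, float), np.asarray(r2, float)
    p_obs = (r1 == r2).mean()                          # observed agreement
    p_exp = r1.mean() * r2.mean() + (1 - r1.mean()) * (1 - r2.mean())  # chance agreement (binary elements)
    return float((p_obs - p_exp) / (1 - p_exp))        # -1 to 1

# Hypothetical abstractions (1 = element present) from two reviewers, per critical data element.
elements = {
    "element_1": ([1, 1, 0, 1, 0, 1], [1, 1, 0, 0, 0, 1]),
    "element_2": ([1, 0, 1, 1, 1, 0], [1, 0, 1, 1, 1, 0]),
}
results = {name: (percent_agreement(r1, r2), cohens_kappa(r1, r2)) for name, (r1, r2) in elements.items()}
lowest = min(results, key=lambda name: results[name][1])
print(results, "lowest critical data element:", lowest)
```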

Patient/Encounter Level (Data Element Level) Testing

057

*Statistical Results: Overall Denominator

After applying denominator exclusions, indicate the result for the overall denominator of the statistic selected above. If not tested at the denominator level, enter 9999.

Numeric field

Patient/Encounter Level (Data Element Level) Testing

058

*Statistical Results: Overall Numerator

Indicate the result for the overall numerator of the statistic selected above. If not tested at the numerator level, enter 9999.

Numeric field

Patient/Encounter Level (Data Element Level) Testing

059

*Interpretation of results

Briefly describe the interpretation of results. Include a list of all data elements tested including their frequency, statistical results, and 95% confidence intervals, as applicable. Include 95% confidence intervals for the overall denominator and numerator results, as applicable. Provide results broken down by test site to demonstrate whether reliability/validity varied between sites, if available. If more room is needed and testing results are included in an attachment (e.g., feasibility scorecard), provide the name of the attachment and location in the attachment.


If any data element has low reliability or validity, describe the anticipated impact and whether it could introduce bias to measure scores. If there is variation in reliability or validity scores across test sites/measured entities, describe how this variation impacts overall interpretation of the results.

ADD YOUR CONTENT HERE


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Patient-Reported Data

060

*Does the performance measure use survey or patient-reported data?

Indicate whether the performance measure utilizes data from structured surveys or patient-reported tools.

Yes

No

n/a

n/a

If you select “Yes” in Row 060, then Row 061 becomes a required field. If you select “No” in Row 060, then skip to Row 065.

n/a

This is not a data entry field.

Patient-Reported Data

061

*Surveys or patient-reported outcome tools

List each survey or patient-reported outcome tool accepted by the performance measure and indicate whether the tool(s) are being used as originally specified and tested or if modifications are required. If available, provide each survey or tool as a link or attachment.


Describe the mode(s) of administration available (e.g., electronic, phone, mail) and the number of languages the survey(s) or tool(s) are available in.


Indicate whether any of the surveys or tools is proprietary, requiring licenses or fees for use.


ADD YOUR CONTENT HERE


Patient-Reported Data

062

*Survey level testing

Indicate whether each patient survey or patient-reported outcome tool has been validated by a peer reviewed study or empirical testing. For a list of acceptable types of testing, please refer to the latest CMS Blueprint version (https://mmshub.cms.gov/measure-lifecycle/measure-testing/evaluation-criteria/scientific-acceptability/reliability).


Select “Yes” if you can provide relevant testing of the survey or tool conducted either prior to development of the performance measure or as part of the development of the performance measure.


Select “No” if any of the surveys or tools included in the measure have not been validated.

Yes

No

n/a

n/a

If you select “Yes” in Row 062, then Rows 063-064 become required fields. If you select “No” in Row 062, then skip to Row 065.

n/a

This is not a data entry field.

Patient-Reported Data

063

*Type of testing analysis

Select all that apply.

Internal Consistency

Construct Validity

Other (enter here):

Patient-Reported Data

064

*Testing methodology and results

Briefly describe the method used to psychometrically test or validate the patient survey or patient-reported outcome tool (e.g., Cronbach’s alpha, ICC, Pearson correlation coefficient, Kuder-Richardson test). If the survey or tool was developed prior to the development of the performance measure, describe how the intended use of the survey or tool for the performance measure aligns with the survey or tool as originally designed and tested. Indicate whether the measure uses all components within a tool, or only parts of the tool. Summarize the statistical results and briefly describe the interpretation of results.

ADD YOUR CONTENT HERE
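Illustrative sketch for Rows 063-064: a minimal Python example of one internal-consistency statistic named in the guidance (Cronbach's alpha), computed from a hypothetical respondents-by-items response matrix. It is not part of the form.

```python
import numpy as np

def cronbachs_alpha(responses):
    """responses: 2-D array, rows = respondents, columns = survey items."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                           # number of items
    item_vars = responses.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = responses.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 items.
print(round(cronbachs_alpha([[4, 5, 4], [3, 3, 2], [5, 5, 5], [2, 3, 3]]), 2))
```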




Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Measure Performance

065

*Measure performance - type of score

Select one.

Categorical (e.g., yes/no)

Continuous variable (e.g., average)

Count

Frequency Distribution

Non-weighted score/composite scale

Rate

Proportion

Ratio

Weighted score/composite scale

Other (enter here):

Measure Performance

066

*Measure performance score interpretation

Select one.

Better quality = Higher score

Better quality = Lower score

Better quality = Score within a defined interval

Passing score above a specified threshold defines better quality

Passing score below a specified threshold defines better quality


n/a

n/a

If you select “Better quality = Higher score” or “Better quality = Lower score” in Row 066, then Rows 070-079 become required fields. If you select “Better quality = Score within a defined interval” in this field, then Rows 068-079 become required fields. If you select “Passing score above a specified threshold defines better quality” or “Passing score below a specified threshold defines better quality” in this field, then Row 067 and Rows 070-079 become required fields.

n/a

This is not a data entry field

Measure Performance


067


*Passing score


Provide the value that indicates the passing score for the performance measure.


Please enter only one value in the response field and do not enter a range of values.


If unknown or not available, enter 9999.

Numeric field

Measure Performance


068


*Lower limit of defined interval

Provide the lower limit for the performance score’s defined interval.


For example, if the defined interval is 60 - 120 minutes, enter the lower limit of 60 here.


Please enter only one value in the response field and do not enter a range of values.


If unknown or not available, enter 9999.

Numeric field

Measure Performance


069


*Upper limit of defined interval

Provide the upper limit for the performance score’s defined interval.


For example, if the defined interval is 60 – 120 minutes, enter the upper limit of 120 here.


Please enter only one value in the response field and do not enter a range of values.


If unknown or not available, enter 9999.

Numeric field

Measure Performance


070


*Number of accountable entities included in analysis

Provide the number of accountable entities included in the analysis of the distribution of performance scores described in "Overall mean performance score" through "Overall standard deviation of performance scores."


Please enter a single value and do not enter a range.


If unknown or not available, enter 9999.

Numeric field

Measure Performance

071

*Number of accountable entities: unit

Provide the unit of accountable entities included in the analysis of the distribution of performance scores described in "Overall mean performance score" through "Overall standard deviation of performance scores."

ADD YOUR CONTENT HERE

Measure Performance

072

*Overall mean performance score

Provide the mean performance score across accountable entities in the test sample that is relevant to the intended use of the measure.


Note: for MIPS submissions, please provide individual clinician-level results. If the measure was also tested at the clinician group level, you may include those results in an attachment.


Please enter only one value in the response field and do not enter a range of values.


If this is a proportion measure, provide the mean performance score in percentage form, without the symbol. For example, if the mean performance score is 97.9%, enter 97.9 and not 0.979.


If a mean performance score is not available, enter 9999.

Numeric field

Measure Performance

073

*Minimum performance score

Provide the minimum performance score for the testing sample that is relevant to the intended use of the measure.


If this is a proportion measure, provide the minimum performance score in percentage form, without the symbol. For example, if the minimum performance score is 85.6%, enter 85.6 and not 0.856.


If a minimum performance score is not available, enter 9999.

Numeric field

Measure Performance

074

10th percentile

Provide the performance score at the 10th percentile for the testing sample that is relevant to the intended use of the measure.


If this is a proportion measure, provide the 10th percentile score in percentage form, without the symbol. For example, if the 10th percentile performance score is 21.2%, enter 21.2 and not 0.212.


If a 10th percentile performance score is not available, enter 9999.

Numeric field

Measure Performance

075

*50th percentile (median)

Provide the median performance score (50th percentile) for the testing sample that is relevant to the intended use of the measure.


Please enter only one value in the response field and do not enter a range of values.


If this is a proportion measure, provide the median performance score in percentage form, without the symbol. For example, if the median performance score is 85.6%, enter 85.6 and not 0.856.


If a median performance score is not available, enter 9999.

Numeric field

Measure Performance

076

90th percentile

Provide the performance score at the 90th percentile for the testing sample that is relevant to the intended use of the measure.


If this is a proportion measure, provide the 90th percentile score in percentage form, without the symbol. For example, if the 90th percentile performance score is 85.6%, enter 85.6 and not 0.856.


If a 90th percentile performance score is not available, enter 9999.

Numeric field

Measure Performance

077

*Maximum performance score

Provide the maximum performance score for the testing sample that is relevant to the intended use of the measure.


If this is a proportion measure, provide the maximum performance score in percentage form, without the symbol. For example, if the maximum performance score is 85.6%, enter 85.6 and not 0.856.


If a maximum performance score is not available, enter 9999.

Numeric field

Measure Performance

078

*Overall standard deviation of performance scores

Provide the standard deviation of performance scores for the testing sample that is relevant to the intended use of the measure.

Numeric field
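Illustrative sketch for Rows 070-078: a minimal Python example, using hypothetical entity-level proportion scores, that produces the requested distribution statistics in percentage form without the symbol (e.g., 97.9, not 0.979). It is not part of the form.

```python
import numpy as np

# Hypothetical entity-level proportion scores, converted to percentage form per the guidance.
scores = np.array([0.856, 0.902, 0.979, 0.768, 0.911, 0.874, 0.940]) * 100

summary = {
    "n_entities": scores.size,            # Row 070
    "mean": scores.mean(),                # Row 072
    "min": scores.min(),                  # Row 073
    "p10": np.percentile(scores, 10),     # Row 074
    "median": np.percentile(scores, 50),  # Row 075
    "p90": np.percentile(scores, 90),     # Row 076
    "max": scores.max(),                  # Row 077
    "std": scores.std(ddof=1),            # Row 078
}
print({k: round(float(v), 1) for k, v in summary.items()})
```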

Measure Performance

079

*Is there evidence for statistically significant gaps in measure score performance among select subpopulations of interest defined by one or more social risk factors?

Select one. Social risk factors may include age, race, ethnicity, linguistic and cultural context, sex, gender, sexual orientation, social relationships, residential and community environments, Medicare/Medicaid dual eligibility, insurance status (insured/uninsured), urbanicity/rurality, disability, and health literacy.

Yes

No

Not tested


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Importance

080

*Meaningful to Patients. Was input on the final performance measure collected from patients and/or caregivers?

Select one. Input from patients and/or caregivers can include any of the following:

  • Patients

  • Primary caregivers

  • Family

  • Other relatives

Yes

No

n/a

n/a

If you select “Yes” in Row 080, then Rows 081 and 082 become required fields. If you select “No” in Row 080, then skip to Row 083.

n/a

This is not a data entry field.

Importance

081

*Denominator: Total number of patients and/or caregivers who responded to the question asking whether information from the measure (e.g., the measured outcome or process) is important to know about AND can help improve care for patients in similar situations or with similar conditions.

Indicate the total number of patients/caregivers who responded.

Numeric field

Importance

082

*Numerator: Total number of patients and/or caregivers who agreed that information from the measure (e.g., the measured outcome or process) is important to know about AND can help improve care for patients in similar situations or with similar conditions.

Indicate the total number of patients/caregivers who agreed.

Numeric field

Importance

083

*Estimated impact of the measure: Estimate of annual denominator size

Enter the numerical value of the estimated annual denominator size across accountable entities eligible to report the measure. This can be estimated from the average entity-level denominator in the test sample multiplied by the approximate number of eligible entities that may report the measure. If the measure requires a multi-year denominator, divide the estimate to report the estimated number of denominator cases per year rather than for the full denominator period.


If it is not possible to estimate based on the testing sample and other publicly available information, enter 9999.

Numeric field
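Illustrative sketch for Row 083: the arithmetic described in the guidance, with hypothetical figures (average entity-level denominator, approximate number of eligible entities, and a multi-year denominator period). It is not part of the form.

```python
# All values below are hypothetical placeholders.
avg_entity_denominator = 180    # average denominator per entity in the test sample
eligible_entities = 2500        # approximate number of entities that may report the measure
denominator_period_years = 2    # e.g., a two-year rolling denominator

# Average entity-level denominator x eligible entities, divided by years in the denominator period.
estimated_annual_denominator = (avg_entity_denominator * eligible_entities) / denominator_period_years
print(int(estimated_annual_denominator))  # value to enter in Row 083 (or 9999 if it cannot be estimated)
```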

Importance

084

*Were the measured entities (or others) consulted on the final performance measure to assess whether the measure is easy to understand AND is useful for decision-making?

Select one. The assessment of whether the measure is easy to understand AND useful for decision-making may be obtained from measured entities, or others such as consumers, purchasers, policy makers, etc., using any of the following methods:


  • Focus groups

  • Structured interviews

  • Surveys of potential users


Notes:

  • This is separate from face validity testing of the performance measure.

  • The desired threshold is 60% or greater of measured entities (or others) who respond in agreement that the information produced by the performance measure is easy to understand AND useful for decision-making.

Yes

No

n/a

n/a

If you select “Yes” in Row 084, then Rows 085-086 become required fields. If you select “No” in Row 084, then skip to Row 087.

n/a

This is not a data entry field.

Importance

085

*Denominator: Total number of measured entities (or others) who responded when asked if information produced by the performance measure is easy to understand AND useful for decision-making

Enter the total number of measured entities (or others) who responded when asked if information produced by the performance measure is easy to understand AND useful for decision-making.


Notes:

  • This is separate from any face validity testing.

  • The assessment of understandability and decision-making utility of the measure may be obtained from measured entities, or others such as consumers, purchasers, policy makers, etc.

  • The desired threshold is 60% or greater of measured entities (or others) who respond in agreement that the information produced by the performance measure is easy to understand AND useful for decision-making.

Numeric field

Importance

086

*Numerator: Total number of measured entities (or others) who agreed that information produced by the performance measure is easy to understand AND useful for decision-making

Enter the total number of measured entities (or others) who responded in agreement that the information produced by the performance measure is easy to understand AND useful for decision-making.


Note:

  • This is separate from face validity testing of the performance measure.

  • The assessment of understandability and decision-making utility of the measure may be obtained from measured entities, or others, such as consumers, purchasers, policy makers, etc.

  • The desired threshold is 60% or greater of those being measured (or others) who respond in agreement that the information produced by the performance measure is easy to understand AND useful for decision-making.

Numeric field

Importance

087

*Estimated impact of the measure: Estimate of annual denominator size: unit

Indicate the unit (e.g., patients) of the estimate of annual denominator size.

Free text field


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Background Information

088

*What is the history or background for including this measure on the current year MUC List?

Select one

New measure never previously submitted to the MUC List, reviewed by Measure Applications Partnership (MAP) Workgroup, or used in a CMS program

Submitted previously but not included in MUC List

Measure previously submitted to MAP, refined, and resubmitted per MAP recommendation

Measure currently used in a CMS program being submitted as-is for a new or different program

Measure currently used in a CMS program, but the measure is undergoing substantial change

n/a

n/a

If you select “New measure never previously submitted to the MUC List, reviewed by Measure Applications Partnership (MAP) Workgroup, or used in a CMS program” in Row 088, then skip to Row 101. If you select “Measure currently used in a CMS program being submitted as-is for a new or different program” or “Measure currently used in a CMS program, but the measure is undergoing substantial change,” then Rows 097-099 become required fields.

n/a

This is not a data entry field.

Previous Measures

089

*Was this measure published on a previous year's Measures Under Consideration list?

Select 'Yes' or 'No.' If yes, you are either submitting an existing measure for expansion into additional CMS programs or the measure has substantially changed since it was originally published.

Yes

No

n/a

n/a

If you select “Yes” in Row 089, then Rows 090-096 in the Previous Measures section become required fields. If you select “No” in Row 089, then skip to Row 098.

n/a

This is not a data entry field.

Previous Measures

090

*In what prior year(s) was this measure published on the Measures Under Consideration List?

Select all that apply. NOTE: If your measure was published on more than one prior annual MUC List, click “Add Another Measure” in the MERIT interface and complete the information section for each of those years.

2011

2012

2013

2014

2015

2016

2017

2018

2019

2020

2021

2022

Other (enter here):

Previous Measures

091

*What was the MUC ID for the measure in each year?

List both the year and the associated MUC ID number in each year. If unknown, enter N/A.

ADD YOUR CONTENT HERE


Previous Measures

092

*List the CMS CBE MAP workgroup(s) in each year

List both the year and the associated workgroup name in each year. Workgroup options: Clinician; Hospital; Post-Acute Care/Long-Term Care; Coordinating Committee. Example: "Clinician, 2014."

ADD YOUR CONTENT HERE


Previous Measures

093

*What were the programs that MAP reviewed the measure for in each year?

List both the year and the associated CMS programs in each year.

ADD YOUR CONTENT HERE


Previous Measures

094

*What was the MAP recommendation in each year?

List the year(s), the program(s), and the associated recommendation(s) in each year. Options: Support; Do Not Support; Conditionally Support; Refine and Resubmit.

ADD YOUR CONTENT HERE


Previous Measures

095

*Why was the measure not recommended by the MAP workgroups in those year(s)?

Briefly describe the reason(s) if known.

ADD YOUR CONTENT HERE


Previous Measures

096

*MAP report page number being referenced for each year

List both the year and the associated MAP report page number for each year.

ADD YOUR CONTENT HERE


Background Information

097

*Range of year(s) this measure has been used by CMS Program(s).

For example: Hospice Quality Reporting (2012-2018)

ADD YOUR CONTENT HERE


Background Information

098

*What other federal programs are currently using this measure?

Select all that apply. List only programs currently using the measure, not programs targeted for the upcoming year’s submission.

Ambulatory Surgical Center Quality Reporting Program

End-Stage Renal Disease (ESRD) Quality Incentive Program

Home Health Quality Reporting Program

Hospice Quality Reporting Program

Hospital-Acquired Condition Reduction Program

Hospital Inpatient Quality Reporting Program

Hospital Outpatient Quality Reporting Program

Hospital Readmissions Reduction Program

Hospital Value-Based Purchasing Program

Inpatient Psychiatric Facility Quality Reporting Program

Inpatient Rehabilitation Facility Quality Reporting Program

Long-Term Care Hospital Quality Reporting Program

Medicare Promoting Interoperability Program for Eligible Hospitals and Critical Access Hospitals (CAHs)

Medicare Shared Savings Program

Merit-based Incentive Payment System-Cost

Part C & D Star Ratings [Medicare]

Prospective Payment System-Exempt Cancer Hospital Quality Reporting Program

Rural Emergency Hospital Quality Reporting Program

Skilled Nursing Facility Quality Reporting Program

Skilled Nursing Facility Value-Based Purchasing Program

Other (enter here):

Background Information

099

*How will this measure align with the same measure(s) that are currently used in other federal programs?

Describe how this measure will achieve alignment with the same measure(s) that are currently used in other federal programs. Please include the names of the same measure(s) that are used in other federal programs and include the corresponding unique identifier (e.g., federal program ID, NQF#, etc.), if available.


Alignment is achieved when a set of measures works well across care settings or programs to produce meaningful information without creating extra work for those responsible for the measurement. Alignment includes using the same quality measures in multiple programs when possible. It can also come from consistently measuring important topics across care settings.

ADD YOUR CONTENT HERE

Previous Measures

100

*If this measure is being submitted to meet a statutory requirement, list the corresponding statute

List title and other identifying citation information. If this measure is not being submitted to meet a statutory requirement, enter N/A.

ADD YOUR CONTENT HERE



Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Data Sources

101

*What data sources are used for the measure?

Select all that apply.

Use the next field to specify or elaborate on the type of data source, if needed to define your measure.

Administrative Data (non-claims)

Claims Data

Electronic Clinical Data (non-EHR)

Electronic Health Record

Paper Medical Records

Standardized Patient Assessments

Patient Reported Data and Surveys

Registries

Other (enter here):

Data Sources

102

*The current measure specifications allow for the use of at least one digital data source.

Select “Yes” if measure data sources include at least one of the following:

  • Administrative Claims

  • Administrative Data

  • Patient Assessment Instrument (e.g., MDS, LTCH-CARE, OASIS)

  • EHR

  • Registry (e.g., QCDR, Qualified Registry, EQRS)


Select “No” if measure data sources are limited to the following:

  • Chart-Abstracted

  • Survey (for example, CAHPS, QRS Survey, and HOS are not currently captured digitally)

  • Part B claims measures (MIPS) reported using Quality Data codes

  • Paper Medical Records

Yes

No

Data Sources

103

If applicable, specify the data source

Use this field to specify or elaborate on the type of data source, if needed to define your measure.

ADD YOUR CONTENT HERE


Data Sources

104

Description of parts related to each data source

Describe the parts or elements of the measure that are relevant to the selected data sources.

ADD YOUR CONTENT HERE





STEWARD


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Steward Information

105

*Measure Steward

Enter the current Measure Steward. Typically, this is an organization or other agency/institution/entity name.

See Appendix A.084-086 for list choices.

Copy/paste or enter your choices here:


Steward Information

106

*Measure Steward Contact Information

Please provide the contact information of the measure steward.

ADD YOUR CONTENT HERE


Long-Term Steward Information

107

*Is the long-term steward different from the steward?

The long-term steward is the entity or entities that will be the permanent measure steward(s), responsible for maintaining the measure and conducting CBE endorsement maintenance review. Select one.

Yes

No

n/a

n/a

If you select “Yes” in Row 107, then Row 108 becomes a required field. If you select “No” in Row 107, then skip to Row 109.

n/a

This is not a data entry field.

Long-Term Steward Information

108

*Long-Term Measure Steward Contact Information

If different from the Steward above, enter the required contact information for the Long-Term Measure Steward.

ADD YOUR CONTENT HERE


Submitter Information

109

Is the primary submitter the same as the steward?

Select “Yes” or “No.”

Yes

No

Submitter Information

110

*Primary Submitter Contact Information

If different from Steward above: Last name, First name; Affiliation; Telephone number; Email address. NOTE: The primary and secondary submitters entered here do not automatically have read/write/change access to modify this measure in CMS MERIT. To request such access for others, when logged into the CMS MERIT interface, navigate to “About” and “Contact Us,” and indicate the name and e-mail address of the person(s) to be added.

ADD YOUR CONTENT HERE


Submitter Information

111

Secondary Submitter Contact Information

If different from name(s) above: Last name, First name; Affiliation; Telephone number; Email address.

ADD YOUR CONTENT HERE


n/a

n/a

If applicable, select from the drop-down menu “Other MERIT users who will contribute to this measure.”

n/a

This is not a data entry field.





CHARACTERISTICS


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

General Characteristics

112

*Measure Type

Select only one type of measure. For definitions, see:

https://mmshub.cms.gov/about-quality/new-to-measures/types.

Cost/Resource Use

Efficiency

Intermediate Outcome

Outcome

PRO-PM or Patient Experience of Care

Process

Structure

n/a

n/a

If you select “PRO-PM or Patient Experience of Care” in Row 112, then Row 113 becomes a required field. If you select “Outcome” or “PRO-PM or Patient Experience of Care” in Row 112, then Row 147 in the Evidence section becomes a required field.

n/a

This is not a data entry field.

General Characteristics

113

*Assessment of patient experience of care

Select one. Indicate whether this measure assesses patient experience of care.

Yes

No

General Characteristics

114

*Is this measure in the CMS Measures Inventory Tool (CMIT)?

Select Yes or No. Current measures can be found at https://cmit.cms.gov/CMIT_public/ListMeasures

Yes

No

n/a

n/a

If you select “Yes” in Row 114, then Row 115 becomes a required field. If you select “No” in Row 114, then skip to Row 116.

n/a

This is not a data entry field.

General Characteristics

115

*CMIT ID

If the measure is currently in CMIT, enter the CMIT ID in the format #####-X-XXXXXXX. Current measures and CMIT IDs can be found at https://cmit.cms.gov/CMIT_public/ListMeasures

ADD YOUR CONTENT HERE


General Characteristics

116

Alternate Measure ID

This is an alphanumeric identifier (if applicable), such as a recognized program ID number for this measure (20 characters or less). Examples: 199 GPRO HF-5; ACO 28; CTM-3; PQI #08. DO NOT enter consensus-based entity (endorsement) ID, CMIT ID, or previous year MUC ID in this field.

ADD YOUR CONTENT HERE


General Characteristics

117

*What is the target population of the measure?

What populations are included in this measure? e.g., Medicare Fee for Service, Medicare Advantage, Medicaid, Children’s Health Insurance Program (CHIP), All Payer, etc.

ADD YOUR CONTENT HERE


General Characteristics

118

*What one area of specialty is the measure aimed at, or which specialty is most likely to report this measure?

Select the ONE most applicable area of specialty.

See Appendix A.097 for list choices. Copy/paste or enter your choice(s) here:


General Characteristics

119

*Evidence of performance gap

Evidence of a performance gap among the units of analysis in which the measure will be implemented. Provide analytic evidence that the units of analysis have room for improvement and, therefore, that the implementation of the measure would be meaningful.


If you have lengthy text, add the evidence as an attachment, named to clearly indicate the related form field.

ADD YOUR CONTENT HERE


General Characteristics

120

*Unintended consequences

Summary of potential unintended consequences if the measure is implemented. Information can be taken from the CMS consensus-based entity Consensus Development Process (CDP) manuscripts or documents. If referencing CDP documents, you must submit the document or a link to the document, and the page being referenced.

ADD YOUR CONTENT HERE



Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Evidence

121

*Type of evidence to support the measure

Select all that apply. Refer to the latest CMS Blueprint version (https://mmshub.cms.gov/measure-lifecycle/measure-conceptualization/information-gathering-overview) and the supplementary material related to evidence review (https://mmshub.cms.gov/sites/default/files/Environmental-Scans.pdf) to obtain updated guidance.

Clinical Guidelines or USPSTF (U.S. Preventive Services Task Force) Guidelines

Peer-Reviewed Systematic Review

Peer-Reviewed Original Research

Empirical data

Grey Literature

n/a

n/a

If you select “Clinical Guidelines or USPSTF (U.S. Preventive Services Task Force) Guidelines” in Row 121, then Rows 122-129 become required fields. If you select “Peer-Reviewed Systematic Review” in Row 121, then Rows 131 and 137-139 become required fields. If you select “Empirical data” in Row 121, then Rows 131 and 142-144 become required fields. If you select “Grey Literature” in Row 121, then Rows 131 and 145-147 become required fields.

n/a

This is not a data entry field.

Evidence

122

*Number of clinical guidelines, including USPSTF guidelines, that address this topic

Enter a numerical value of ≥1. Count all guidelines that are relevant to this measure topic, including those that offer contradictory guidance.

Numeric field

Evidence

123

*Outline the clinical guideline(s) supporting this measure

Provide a detailed description of which guideline(s) support the measure and indicate, for each, whether they are evidence-based or consensus-based.


Summarize the meaning/rationale of the guideline statements that are being referenced, their relation to the measure concept and how they support the measure whether directly or indirectly, and how the guideline statement(s) relate to the measure’s intended accountable entity. Describe the body of evidence that supports the statement(s) by describing the quantity, quality, and consistency of the studies that are pertinent to the guideline statements. The quantity of studies represents the number of studies, not the number of publications associated with a study. If the statement is informed by 3 publications reporting outcomes from the same RCT at 3 different time points, this is considered a single study and not 3 studies.


If referencing a standard norm which may or may not be driven by evidence, provide the description and rationale for this norm or threshold as reasoned by the guideline panel.


If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a link between at least one process, structure, or intervention and the outcome.


Document the criteria used to assess the quality of the clinical guidelines, such as those proposed by the Institute of Medicine or the ECRI Guidelines Trust (see the CMS Blueprint (https://mmshub.cms.gov/measure-lifecycle/measure-conceptualization/information-gathering-overview) and the supplementary material related to evidence review (https://mmshub.cms.gov/sites/default/files/Environmental-Scans.pdf)).


If there is lengthy text, describe the guidelines in an evidence attachment, named to clearly indicate the related form field.

ADD YOUR CONTENT HERE


Evidence

124

*Name the guideline developer/entity

If the response to “Number of clinical guidelines, including USPSTF guidelines, that address this topic” (Row 122) is >1, identify the guideline that most closely aligns with and supports your measure concept. This guideline is referred to below as the primary clinical guideline.


Spell out the primary clinical guideline entity’s name followed by the appropriate acronym, if available.

For example: United States Preventive Services Task Force (USPSTF)

ADD YOUR CONTENT HERE


Evidence

125

*Publication year

Provide the publication year for the primary clinical guideline.

Use the 4-digit format (e.g., 2016).

Numeric field (4-digit year)

Evidence

126

*Guideline citation

Provide any of the following:

  • Full citation for the primary clinical guideline in any established citation style (e.g., AMA, APA, Chicago, Vancouver, etc.)

  • URL

  • DOI or ISBN for clinical guideline document

Citation (enter here)

URL (enter here)

DOI (enter here)

Not available


Evidence

127

*Is this an evidence-based clinical guideline

There are disparate methods of developing clinical guidance documents. An evidence-based guideline is one that uses evidence to inform the development of its recommendations. The evidence must be reviewed in a deliberate, systematic manner. To determine this, the developer must have provided a description of a systematic search of the literature and its search strategy, which includes the dates of the literature covered, the databases consulted, and a screening, review, and data extraction process.


Select “No” for clinical guidelines that are based purely on expert consensus with or without supplementation with a narrative literature review (non-systematic).

Yes

No

Evidence

128

*Does the clinical guideline include a publicly available evidence summary?

Evidence-based clinical guidelines should be accompanied by a publicly available evidence summary. If the guideline includes an evidence summary, please select “Yes” and provide a link to the evidence summary in the text box.

Yes (enter URL here:)

No

Evidence

129

*Is the selected guideline statement used to support an inappropriate use/care measure?

Select one. Indicate whether the guideline statement referenced in “List the guideline statement that most closely aligns with the measure concept” (Row 131) is used to promote the practice of not performing a specific action, process, or intervention, in support of an inappropriate use or inappropriate care measure.

Yes

No

Evidence

130

*For the guideline statement that most closely aligns with the measure concept, what is the associated level of evidence or level of certainty in the evidence?

Select the associated level of evidence or certainty of evidence using the convention used by the guideline developer.


Select one.

High or similar

Moderate or similar

Low, Very Low or similar

Other (enter here)

Evidence

131

*List the guideline statement that most closely aligns with the measure concept.

If more than one statement from this clinical guideline may be relevant to this measure concept, document the statement that most closely aligns with the measure concept, as it is written in the guideline document. For example, Statement 1: In patients aged 65 years and older who have prediabetes, we recommend a lifestyle program similar to the Diabetes Prevention Program to delay progression to diabetes. No more than one statement should be written in the text box. All other relevant statements should be submitted in a separate evidence attachment.

ADD YOUR CONTENT HERE

Evidence

132

*Is the guideline graded?

A graded guideline is one which explicitly provides evidence rating and recommendation grading conventions in the document itself. Grades are usually found next to each recommendation statement.


Select one.

Yes

No

n/a

n/a

If you select “Yes” in Row 132, then Rows 133-138 become required fields.

n/a

This is not a data entry field.

Evidence

133

*What evidence grading system did the guideline use to describe strength of recommendation?

Select the evidence grading system used by the clinical guideline (e.g., GRADE or USPSTF) to describe the guideline statement’s strength of recommendation.

GRADE method

Modified GRADE

USPSTF

Other (enter here)

Evidence

134

*List all categories and corresponding definitions for the evidence grading system used to describe strength of recommendation in the guideline.

Insert the complete list of grading categories and their definitions.

ADD YOUR CONTENT HERE


Evidence

135

*For the guideline statement that most closely aligns with the measure concept, what is the associated strength of recommendation?

Select the associated strength of recommendation using the convention used by the guideline developer.


Select one.

USPSTF Grade A, Strong recommendation or similar

USPSTF Grade B, Moderate recommendation or similar

USPSTF Grade C or I, Conditional/weak recommendation or similar

Expert Opinion

USPSTF Grade D, Moderate or high certainty that service has no net benefit or harm outweighs benefit

Best Practice Statement/Standard Practice

Evidence

136

*List all categories and corresponding definitions for the evidence grading system used to describe level of evidence or level of certainty in the evidence.

Insert the complete list of grading categories and their definitions.

ADD YOUR CONTENT HERE


Evidence

137

*Number of systematic reviews that inform this measure concept

Insert the number of peer-reviewed systematic reviews that address this measure topic. This includes systematic reviews that address the same intervention/process/structure but may have conflicting conclusions.


Enter a numerical value of greater than or equal to 1.

Numeric field

Evidence

138

*Briefly summarize the peer-reviewed systematic review(s) that inform this measure concept

Summarize the peer-reviewed systematic review(s) that address this measure concept. For each systematic review, provide the number of studies within the systematic review that addressed the specifications defined in this measure concept, indicate whether a study-specific risk of bias/quality assessment was performed for each study, and describe the consistency of findings. The number of studies is not equivalent to the number of publications. If there are three publications from a single cohort study cited in the systematic review, report one when indicating the number of studies. If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a relationship between at least one process, structure, or intervention and the outcome.


If there is lengthy text, submit details via an evidence attachment.

ADD YOUR CONTENT HERE


Evidence

139

*Peer-reviewed systematic review citation

If more than one article was identified, provide at least one of the following for one key article:

  • Citation

  • URL

  • DOI


Provide the complete list of citations with accompanying DOI or URL in a separate attachment.

Citation (enter here:)

URL (enter here:)

DOI (enter here:)

Not available

Evidence

140

*Peer-reviewed original research

If the evidence synthesis provided to support this measure concept was performed using peer-reviewed original research articles, indicate whether a systematic search of the literature was conducted.

Yes (please provide search strategy in an attachment; e.g., years searched, keywords and search terms used, databases used, etc.)

No

Evidence

141

*Peer-reviewed original research citation

If more than one article was identified, provide at least one of the following for one key article:

  • Citation

  • URL

  • DOI


Provide the complete list of citations with accompanying DOI or URL in a separate attachment.

Citation (enter here:)

URL (enter here:)

DOI (enter here:)

Not available

Evidence

142

*Source of empirical data

Select all that apply

Peer-reviewed narrative literature review

Published and publicly available reports (e.g., from agencies)

Internal data analysis

Other (enter here)

Evidence

143

*Summarize the empirical data

Provide a summary of the empirical data and how it informs this measure concept. Describe the limitations of the data. If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a link between at least one process, structure, or intervention and the outcome. If there is lengthy text, include details in a separate evidence attachment.

ADD YOUR CONTENT HERE


Evidence

144

*Empirical data citation

If more than one source of empirical data was identified, provide at least one of the following for one key source:

  • Citation

  • URL

  • DOI


Provide the complete list of citations with accompanying DOI or URL in a separate attachment.

Citation (enter here:)

URL (enter here:)

DOI (enter here:)

Not available

Evidence

145

*Name grey literature

If more than one grey literature source was identified, provide at least one of the following for one key piece of evidence:

  • Citation

  • URL

  • DOI

Provide the complete list of citations with accompanying DOI or URL in a separate attachment.


ADD YOUR CONTENT HERE


Evidence

146

*Summarize the grey literature

Provide a summary of the grey literature used to inform this measure concept. Describe the limitations of the data. If this is an outcome measure or PRO-PM, indicate how the evidence supports or demonstrates a link between at least one process, structure, or intervention and the outcome.



ADD YOUR CONTENT HERE


Evidence

147

*Grey literature citation

If more than one grey literature source was identified, provide at least one of the following for one key piece of evidence:

  • Citation

  • URL

  • DOI


Provide the complete list of citations with accompanying DOI or URL in a separate attachment.

Citation (enter here:)

URL (enter here:)

DOI (enter here:)

Not available

Evidence

148

*Does the evidence discuss a relationship between at least one process, structure, or intervention and the outcome?

Select “Yes” if the evidence discussed in the Evidence section demonstrates a relationship between at least one process, structure, or intervention and the outcome.

Yes

No


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Risk Adjustment and Stratification

149

*Was risk adjustment and/or stratification considered?

Select “Yes” if the measure development process included consideration of risk adjustment and/or stratification, even if the final measure does not include risk adjustment or stratification. While risk adjustment is typically only required for outcome measures, other measure types can select “Yes” if risk adjustment or stratification was considered.


Select “No” if neither risk adjustment nor stratification was considered as part of the measure development process.

Yes

No

n/a

n/a

If you select “Yes” in Row 149, then Rows 150, 152, and 161 become required fields. If you select “No” in Row 149, then skip to Row 163.

n/a

This is not a data entry field.

Risk Adjustment and Stratification

150

*Was a conceptual model outlining the pathway between patient risk factors, quality of care, and the outcome of interest established?

Select “Yes” if a conceptual model was established based on a review of published literature. The conceptual model can be supplemented by other sources of information such as expert opinion or empirical analysis.


Select “No” if a conceptual model was not established or the conceptual model was based solely on expert opinion or empirical analysis.

Yes

No

n/a

n/a

If you select “Yes” in Row 150, then Row 151 becomes a required field. If you select “No” in Row 150, then skip to Row 152.

n/a

This is not a data entry field.

Risk Adjustment and Stratification

151

*Were all key risk factors identified in the conceptual model available for testing?

If some key risk factors were not available for testing or inclusion in the risk model/stratification approach, select “No” and describe the anticipated impact on measure scores (e.g., magnitude and direction of bias).

Yes

No (enter here:)

Risk Adjustment and Stratification

152

*Is the measure risk adjusted?

Indicate whether the final measure is risk adjusted.

Yes

No

n/a

n/a

If you select “Yes” in Row 152, then Rows 153-160 become required fields. If you select “Yes” in Row 152, you are also encouraged to upload documentation about your risk adjustment model as an attachment. If you select “No” in Row 152, then skip to Row 161.

n/a

This is not a data entry field.

Risk Adjustment and Stratification

153

Risk adjustment variable types

Select ALL risk adjustment variable types that are included in your final risk model. For more information on how to select risk factors for accountability measures, refer to the CMS Measures Management System Blueprint (https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/MMS/Downloads/Blueprint.pdf).


Select “Patient-level demographics” if the measure uses information related to each patient’s age, sex, race/ethnicity, etc.


Select “Patient-level health status & clinical conditions” if the measure uses information specific to each individual patient about their health status prior to the start of care (e.g., case-mix adjustment).


Select “Patient functional status” if the measure uses information specific to each individual patient’s functional status prior to the start of care (e.g., body function, ability to perform activities of daily living, etc.)


Select “Patient-level social risk factors” if the measure uses patient-reported information related to their individual social risks (e.g., income, living alone, etc.).


Select “Proxy social risk factors” if the measure uses data related to characteristics of the people in the patient’s community (e.g., neighborhood level income from the census).


Select “Patient community characteristics” if the measure uses information about the patient’s community (e.g., percent of vacant houses, crime rate).


Select “Other” if the risk factor is related to the healthcare provider, health system, or other factor that is not related to the patient.

Patient-level demographics

Patient-level health status & clinical conditions

Patient functional status

Patient-level social risk factors

Proxy social risk factors

Patient community characteristics

Other (enter here):

Risk Adjustment and Stratification

n/a

If you select “Patient-Level Demographics” in Row 153, then Row 154 becomes a required field. If you select “Patient-level health status & clinical conditions” in Row 153, then Row 155 becomes a required field. If you select “Patient functional status” in Row 153, then Row 156 becomes a required field. If you select “Patient-level social risk factors” in Row 153, then Row 157 becomes a required field. If you select “Proxy social risk factors” in Row 153, then Row 158 becomes a required field. If you select “Patient community characteristics” in Row 153, then Row 159 becomes a required field.

n/a

This is not a data entry field.

Risk Adjustment and Stratification

154

*Patient-level demographics: please select all that apply

Select all that apply

Age

Sex

Gender

Race/ethnicity

Other (enter here):

Risk Adjustment and Stratification

155

*Patient-level health status & clinical conditions: please select all that apply

Select all that apply

Case-Mix Adjustment

Severity of Illness

Comorbidities

Health behaviors/health choices

Other (enter here):

Risk Adjustment and Stratification

156

*Patient functional status: please select all that apply

Select all that apply

Body Function

Ability to perform activities of daily living

Other (enter here):

Risk Adjustment and Stratification

157

*Patient-level social risk factors: please select all that apply

Select all that apply

Income

Education

Wealth

Living Alone

Social Support

Other (enter here):


Risk Adjustment and Stratification

158

*Proxy social risk factors: please select all that apply

Select all that apply

Neighborhood Level Income from the Census

Dual Eligibility for Medicare and Medicaid

Other (enter here):


Risk Adjustment and Stratification

159

*Patient community characteristics: please select all that apply

Select all that apply

Percent of Vacant Houses

Crime Rate

Urban/Rural

Other (enter here):


Risk Adjustment and Stratification

160

*Risk model performance

Provide empirical evidence that the risk model adequately accounts for confounding factors (e.g., assessment of model calibration and discrimination). Describe your interpretation of the results.

ADD YOUR CONTENT HERE
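
Note (for illustration only; this is not a template field): the calibration and discrimination assessment requested above is typically summarized with statistics such as the c-statistic and observed-versus-expected event rates. The following is a minimal Python sketch, on synthetic data, of how such statistics might be computed for a hypothetical logistic risk-adjustment model; the variable names, synthetic data, and modeling choices are assumptions for demonstration and are not part of the CMS specification.

# Illustrative sketch only: discrimination (c-statistic) and calibration
# (observed vs. expected by risk decile; calibration intercept and slope)
# for a hypothetical logistic risk-adjustment model on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Hypothetical patient-level risk factors (e.g., age, comorbidity count).
age = rng.normal(70, 10, n)
comorbidities = rng.poisson(2, n)
X = np.column_stack([age, comorbidities])
# Synthetic outcome generated from a known logistic relationship.
logit = -6 + 0.05 * age + 0.4 * comorbidities
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)
pred = model.predict_proba(X)[:, 1]

# Discrimination: c-statistic (area under the ROC curve).
c_stat = roc_auc_score(y, pred)

# Calibration: observed vs. expected event rates within risk deciles.
cuts = np.quantile(pred, np.linspace(0, 1, 11))[1:-1]
bins = np.clip(np.digitize(pred, cuts), 0, 9)
observed = [y[bins == d].mean() for d in range(10)]
expected = [pred[bins == d].mean() for d in range(10)]

# Calibration intercept and slope: regress the outcome on the log-odds of the
# predicted probabilities (slope near 1 and intercept near 0 indicate good calibration).
log_odds = np.log(pred / (1 - pred)).reshape(-1, 1)
recal = LogisticRegression().fit(log_odds, y)

print(f"c-statistic: {c_stat:.3f}")
print(f"calibration slope: {recal.coef_[0][0]:.3f}, intercept: {recal.intercept_[0]:.3f}")
print("decile observed vs. expected:", list(zip(np.round(observed, 3), np.round(expected, 3))))

In-sample results such as these will look optimistic; in practice, the testing attachments referenced elsewhere in this template would report comparable statistics on a validation or split sample.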


Risk Adjustment and Stratification

161

*Is the measure recommended to be stratified?

Indicate whether the final measure is recommended to be stratified.

Yes

No

n/a

n/a

If you select “Yes” in Row 161, then Row 162 becomes a required field. If you select “No” in Row 161 and “No” in Row 152, then Row 163 becomes a required field.

n/a

This is not a data entry field.

Risk Adjustment and Stratification

162

*Stratification approach

Describe the recommended stratification approach including the data elements used to stratify scores for at-risk subgroups. Demonstrate that there is sufficient sample size within measured entities to stratify measure scores. If more room is needed, provide testing results as an attachment and list the name of the attachment in this field.

ADD YOUR CONTENT HERE
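
Note (for illustration only; this is not a template field): one way to demonstrate sufficient sample size within measured entities, as requested above, is to tabulate the denominator available in each stratum for each entity. The following minimal Python sketch assumes a hypothetical flat patient-level file; the column names ("entity_id", "stratum", "numerator") and the minimum denominator of 20 are assumptions for demonstration, not CMS requirements.

# Illustrative sketch only: per-entity, per-stratum denominators and rates,
# flagging combinations that fall below an assumed reporting threshold.
import pandas as pd

# Hypothetical patient-level analytic file.
df = pd.DataFrame({
    "entity_id": ["A", "A", "A", "B", "B", "C", "C", "C", "C"],
    "stratum":   ["dual", "non-dual", "dual", "dual", "non-dual",
                  "dual", "dual", "non-dual", "non-dual"],
    "numerator": [1, 0, 1, 0, 1, 1, 1, 0, 1],
})

MIN_DENOMINATOR = 20  # assumed reporting threshold for this sketch only

summary = (
    df.groupby(["entity_id", "stratum"])
      .agg(denominator=("numerator", "size"), rate=("numerator", "mean"))
      .reset_index()
)
summary["reportable"] = summary["denominator"] >= MIN_DENOMINATOR
print(summary)

A real submission would run this kind of tabulation on the full testing dataset and summarize the share of entities with reportable strata, rather than relying on the toy data shown here.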

Risk Adjustment and Stratification

163

*Rationale for using neither risk adjustment nor stratification

Select ALL reasons for not implementing a risk adjustment model or stratification approach in the measure. For more information, refer to the CMS Measures Management System Blueprint Risk Adjustment in Quality Measurement supplement (https://mmshub.cms.gov/sites/default/files/Risk-Adjustment-in-Quality-Measurement.pdf) and the guidance on defining stratification schemes (https://mmshub.cms.gov/measure-lifecycle/measure-specification/develop-specification/stratification).

Addressed through exclusions (e.g., process measures)

Risk adjustment not appropriate based on conceptual or empirical rationale (enter here):

Data were not available to evaluate risk adjustment or stratification (enter here):

Other (enter here):


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Healthcare Domain

164

*What one Meaningful Measures 2.0 priority is most applicable to this measure?

Select the ONE most applicable Meaningful Measures 2.0 priority. For more information, see: https://www.cms.gov/meaningful-measures-20-moving-measure-reduction-modernization


Person-Centered Care

Equity

Safety

Affordability and Efficiency

Chronic Conditions

Wellness and Prevention

Seamless Care Coordination

Behavioral Health


Healthcare Domain

165

What, if any, additional Meaningful Measures 2.0 priorities apply to this measure?

Select up to two additional Meaningful Measures 2.0 priorities that apply to this measure.


For more information, see: https://www.cms.gov/meaningful-measures-20-moving-measure-reduction-modernization

Person-Centered Care

Equity

Safety

Affordability and Efficiency

Chronic Conditions

Wellness and Prevention

Seamless Care Coordination

Behavioral Health


Other Priorities

166

*Does this measure address CMS priorities to improve maternal health care and maternal outcomes?

Select one.

Yes

No


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Endorsement Characteristics

167

*What is the endorsement status of the measure?

Select only one. For information on consensus-based entity (CMS contractor) endorsement, measure ID, and other information, refer to: https://p4qm.org/

Endorsed

Endorsement removed

Submitted

Failed endorsement

Never submitted

Endorsement Characteristics

168

*CBE ID (CMS consensus-based entity, or endorsement ID)

Four- or five-character identifier, with leading zeros and a trailing letter if needed. Add a letter after the ID if applicable (e.g., 0064e) and place zeros ahead of the ID if necessary (e.g., 0064). If no CBE ID number is known, enter 9999.

ADD YOUR CONTENT HERE


Endorsement Characteristics

169

If endorsed: Is the measure being submitted exactly as endorsed by the CMS CBE?

Select 'Yes' or 'No'. Note that 'Yes' should only be selected if the submission is an EXACT match to the CBE-endorsed measure.

Yes

No

n/a

n/a

If you select “No” in Row 169, then Rows 170-171 become required fields.

n/a

This is not a data entry field.

Endorsement Characteristics

170

If not exactly as endorsed, specify the locations of the differences

Indicate which specification fields are different. Select all that apply.

Measure title

Description

Numerator

Denominator

Exclusions

Target population

Setting (for testing)

Level of analysis

Data source

eCQM status

Other (enter here and see next field):

Endorsement Characteristics

171

If not exactly as endorsed, describe the nature of the differences

Briefly describe the differences.

ADD YOUR CONTENT HERE


Endorsement Characteristics

172

If endorsed: Year of most recent CDP endorsement

Select one

2017

2018

2019

2020

2021

2022

2023

Endorsement Characteristics

173

Year of next anticipated CDP endorsement review

Select one. If you are submitting for initial endorsement, select the anticipated year.

2022

2023

2024

2025

2026

2027








GROUPS


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

n/a

174

*Is this measure an electronic clinical quality measure (eCQM)?

Select 'Yes' or 'No'. If your answer is yes, the Measure Authoring Tool (MAT) ID number must be provided below. For more information on eCQMs, see: https://www.emeasuretool.cms.gov/

Yes

No

n/a

n/a

If you select “Yes” in Row 174, then Rows 175-177 become required fields. If you select “No” in Row 174, then skip to Row 178.

n/a

This is not a data entry field.

n/a

175

*Measure Authoring Tool (MAT) Number

You must attach Bonnie test cases for this measure with 100% logic coverage (test cases should be appended), an attestation that value sets are published in the Value Set Authority Center (VSAC), and a feasibility scorecard. If the measure is not an eCQM, or if the MAT number is not available, enter 0.

ADD YOUR CONTENT HERE


n/a

176

*If eCQM, does the measure have a Health Quality Measures Format (HQMF) specification in alignment with the latest HQMF and eCQM standards, and does the measure align with Clinical Quality Language (CQL) and Quality Data Model (QDM)?

Select 'Yes' or 'No'. For additional information on HQMF standards, see: https://ecqi.healthit.gov/tool/hqmf

Yes

No

n/a

177

*If eCQM, does any electronic health record (EHR) system tested need to be modified?

Select “Yes” if any of the EHR systems tested had to modify how data were entered by providers or stored to facilitate calculation of the eCQM.


Select “No” if the data needed to calculate the eCQM were already included in structured fields in the EHR systems tested and none of them needed to be modified.

Yes

No



RELATED AND COMPETING MEASURES


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

Similar In-Use Measures

178

*Is this measure similar and/or competing with measure(s) already in a program?

Select either Yes or No. Consider other measures with similar purposes.

Yes

No

n/a

n/a

If you select “Yes” in Row 178, then Rows 179-181 become required fields. If you select “No” in Row 178, then skip to Row 182.

n/a

This is not a data entry field.

Related and Competing Measures

179

*Which measure(s) already in a program is your measure similar to and/or competing with?

Identify the other measure(s) including title and any other unique identifier.

ADD YOUR CONTENT HERE


Related and Competing Measures

180

*How will this measure add value to the CMS program?

Describe benefits of this measure, in comparison to measure(s) already in a program.

ADD YOUR CONTENT HERE


Related and Competing Measures

181

*How will this measure be distinguished from other similar and/or competing measures?

Describe key differences that set this measure apart from others.

ADD YOUR CONTENT HERE





ATTACHMENTS


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

N/A

182

Attachment(s)

You are encouraged to attach the measure information form (MIF) if available. This is a detailed description of the measure used by the CMS consensus-based entity (CBE) during endorsement proceedings. If a MIF is not available, comprehensive measure methodology documents are encouraged.

If you are submitting for MIPS (either Quality or Cost), you are required to download the MIPS Peer Reviewed Journal Article Template and attach the completed form to your submission using the “Attachments” feature. See https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/QualityMeasures/Pre-Rulemaking


If your measure is risk adjusted, you are encouraged to attach documentation that provides additional detail about the measure risk adjustment model, such as variables included, associated code system codes, and risk adjustment model coefficients.


If eCQM, you must attach MAT Output/HQMF, Bonnie test cases for this measure, with 100% logic coverage (test cases should be appended), attestation that value sets are published in VSAC, and feasibility scorecard.

ADD YOUR CONTENT HERE


N/A

183

MIPS Peer Reviewed Journal Article Template

Select Yes or No. If you are submitting a measure to the MIPS program, select “Yes” and attach your completed MIPS Peer Reviewed Journal Article Template.

Yes

No


SUBMITTER COMMENTS


Subsection

Row

Field Label

Guidance

ADD YOUR CONTENT HERE

N/A

184

Submitter Comments

Any notes, qualifiers, external references, or other information not specified above.

ADD YOUR CONTENT HERE




Send any questions to [email protected]

Appendix: Lengthy Lists of Choices


A.084-086 Choices for Measure Steward (084) and Long-Term Measure Steward (if different) (086)


Agency for Healthcare Research & Quality

Alliance of Dedicated Cancer Centers

Ambulatory Surgical Center (ASC) Quality Collaboration

American Academy of Allergy, Asthma & Immunology (AAAAI)

American Academy of Dermatology

American Academy of Neurology

American Academy of Ophthalmology

American Academy of Otolaryngology – Head and Neck Surgery (AAO-HNS)

American College of Cardiology

American College of Cardiology/American Heart Association

American College of Emergency Physicians

American College of Emergency Physicians (previous steward Partners-Brigham & Women's)

American College of Obstetricians and Gynecologists (ACOG)

American College of Radiology

American College of Rheumatology

American College of Surgeons

American Gastroenterological Association

American Health Care Association

American Medical Association

American Nurses Association

American Psychological Association

American Society for Gastrointestinal Endoscopy

American Society for Radiation Oncology

American Society of Addiction Medicine

American Society of Anesthesiologists

American Society of Clinical Oncology

American Urogynecologic Society

American Urological Association (AUA)

Audiology Quality Consortium/American Speech-Language-Hearing Association (AQC/ASHA)

Bridges to Excellence

Centers for Disease Control and Prevention

Centers for Medicare & Medicaid Services

Eugene Gastroenterology Consultants, PC Oregon Endoscopy Center, LLC

Health Resources and Services Administration (HRSA) - HIV/AIDS Bureau

Heart Rhythm Society (HRS)

Indian Health Service

Infectious Diseases Society of America (IDSA)

Intersocietal Accreditation Commission (IAC)

KCQA- Kidney Care Quality Alliance

Minnesota (MN) Community Measurement

National Committee for Quality Assurance

National Minority Quality Forum

Office of the National Coordinator for Health Information Technology/Centers for Medicare & Medicaid Services

Oregon Urology Institute

Oregon Urology Institute in collaboration with Large Urology Group Practice Association

Pharmacy Quality Alliance

Philip R. Lee Institute for Health Policy Studies

Primary (care) Practice Research Network (PPRNet) 

RAND Corporation

Renal Physicians Association; joint copyright with American Medical Association

Seattle Cancer Care Alliance

Society of Gynecologic Oncology

Society of Interventional Radiology

The Academy of Nutrition and Dietetics

The Joint Commission

The Society for Vascular Surgery

The University of Texas MD Anderson Cancer Center

University of Minnesota Rural Health Research Center

University of North Carolina- Chapel Hill

Wisconsin Collaborative for Healthcare Quality (WCHQ)

Other (enter in Row 084 and/or Row 086)



A.097 Choices for Areas of specialty (097)


Addiction medicine

Allergy/immunology

Anesthesiology

Behavioral health

Cardiac electrophysiology

Cardiac surgery

Cardiovascular disease (cardiology)

Chiropractic medicine

Colorectal surgery (proctology)

Critical care medicine (intensivists)

Dermatology

Diagnostic radiology

Electrophysiology

Emergency medicine

Endocrinology

Family practice

Gastroenterology

General practice

General surgery

Geriatric medicine

Gynecological oncology

Hand surgery

Hematology/oncology

Hospice and palliative care

Infectious disease

Internal medicine

Interventional pain management

Interventional radiology

Maxillofacial surgery

Medical oncology

Nephrology

Neurology

Neuropsychiatry

Neurosurgery

Nuclear medicine

Nursing

Nursing homes

Obstetrics/gynecology

Ophthalmology

Optometry

Oral surgery (dentists only)

Orthopedic surgery

Osteopathic manipulative medicine

Otolaryngology

Pain management

Palliative care

Pathology

Pediatric medicine

Peripheral vascular disease

Physical medicine and rehabilitation

Plastic and reconstructive surgery

Podiatry

Preventive medicine

Primary care

Psychiatry

Public and/or population health

Pulmonary disease

Pulmonology

Radiation oncology

Rheumatology

Sleep medicine

Sports medicine

Surgical oncology

Thoracic surgery

Urology

Vascular surgery

Other (enter in Row 097)





Send any questions to [email protected]



According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 0938-1314 (Expiration date: 01/31/2025). The time required to complete this information collection is estimated to average 3.5 hours per response, including the time to review instructions, search existing data resources, gather the data needed, and complete and review the information collection. If you have comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: CMS, 7500 Security Boulevard, Attn: PRA Reports Clearance Officer, Mail Stop C4-26-05, Baltimore, Maryland 21244-1850. ****CMS Disclosure**** Please do not send applications, claims, payments, medical records or any documents containing sensitive information to the PRA Reports Clearance Office. Please note that any correspondence not pertaining to the information collection burden approved under the associated OMB control number listed on this form will not be reviewed, forwarded, or retained. If you have questions or concerns regarding where to submit your documents, please contact QPP at [email protected]




