
Supporting Statement – Part A

Generic Clearance for the Center for Clinical Standards and Quality IT Product and Support Teams (CMS-10706)

  A. Background

This is an extension package. The CMS Center for Clinical Standards and Quality (CCSQ) is responsible for administering the information systems through which the public submits healthcare-related information. While beneficiaries ultimately benefit, the primary users of CCSQ IT Product and Support Teams (CIPST) systems are healthcare facility employees and contractors, who are responsible for collecting and submitting the appropriate beneficiary data to CMS in order to receive merit-based compensation.

The systems that support CCSQ programs include, but are not limited to: End-Stage Renal Disease Quality Reporting System (EQRS), Enterprise Shared Services (ESS), HCQIS ServiceNow (SNOW), Hospital Quality Reporting (HQR), Quality Improvement and Evaluation System (iQIES), Quality Management and Reporting System (QMARS), and Quality Payment Program (QPP).

The generic clearance will allow CMS to gather information to improve information systems that serve CMS audiences. CMS will gather this information using a mixture of qualitative and quantitative consumer research strategies (including formative research studies and methodological tests). CMS implements human-centered methods and activities for the improvement of policies, services, and products. This collection of information is necessary to enable CMS to garner customer and stakeholder feedback in an efficient, timely manner, in accordance with our commitment to improving service delivery.

As information systems and technologies are developed or improved upon, they can be tested and evaluated for end-user feedback regarding utility, usability, and desirability. The overall goal is to apply a human-centered engagement model to maximize the extent to which CIPST can gather ongoing feedback from consumers. Feedback helps engineers and designers arrive at better solutions, thereby minimizing the burden on consumers and meeting their needs and goals.

The activities under this clearance involve voluntary engagement with target CCSQ users to receive design and research feedback. The respondents will be voluntary end-users from self-selected customers, as well as convenience samples. It is our intent that selected respondents will either cover a broad range of customers or include specific characteristics related to certain products or services. All collections of information will allow us to continually refine our processes, systems, and services for the benefit of internal and external stakeholders.

Research activities to inform the utility, usability, and desirability of information system development may include:

  • Card sorting: Card sorting is a method used for determining the information architecture labels and structures to be used in a Website or app, and it gives insight into users’ mental models.

  • Cognitive testing: Cognitive testing is typically conducted as an interview and provides vital insights into how people interpret the language and graphics in a product and how they connect what they see to their own experiences.

  • Field Studies: Field research, also known as Contextual Inquiry, is conducted in the user's own context and location. Researchers learn the unexpected by leaving the office and observing people in their natural environment.

  • First click tests: First Click Testing examines what a test participant would click first on the interface to complete their intended task. An analysis may be performed on a functioning Website, a prototype, or a wireframe.

  • Focus Group: A focus group is a moderated discussion that typically involves 5 to 10 participants. Through a focus group, you can learn about users' attitudes, beliefs, desires, and reactions to concepts.

  • User Interviews: A user interview is a method of gaining information by asking questions that relate to your objectives concerning the user's experience with your product/service.

  • Usability testing (Moderated and unmoderated): Usability testing includes a range of test and evaluation methods such as automated evaluations, inspection evaluations, operational evaluations, and human performance testing. In a typical performance test, users perform a variety of tasks with a prototype (or an operational system). At the same time, observers note what each user does and says, and performance data is tracked and recorded. One of the primary purposes of usability testing is to identify issues that keep users from meeting the usability goals of a Website. Moderated usability testing is with the active participation of a facilitator; unmoderated usability testing sessions are completed alone by the participant.

  • Participatory Design: Participatory design is a method for designing digital services. It involves stakeholders, end users, and the team in the design process to help ensure that the end product meets the needs of users (e.g., Customer Journey Workshop, Service Blueprinting).

  • Survey: A survey is a set of questions used to collect topic-specific information from a representative sample of your target audience. Since surveys can be relatively inexpensive, executed quickly, and gather a broad set of data, they are used to collect information on a wide range of topics.

  • Tree testing: Tree testing is a usability technique for evaluating the findability of topics on a Website. It's also known as 'reverse card sorting' or 'card-based classification.' Tree testing is done on a simplified text version of your site structure – without the influence of navigation aids and visual design.

  B. Justification

  1. Need and Legal Basis

CMS continues to align with an initiative to reduce burden by collaborating with its customers to ease the burden currently placed on healthcare clinicians and providers through continuous improvements and customer engagement with the communities we serve. CMS intends to continue gathering and synthesizing customer input and developing recommendations for ways in which the agency can reduce burden by improving its products and services.

CMS is continually implementing and updating information systems as legislation and requirements change. To support this initiative, CIPST must have the capacity for engagement with users in an ongoing variety of research, discovery, and validation activities to create and refine systems that do not place an undue burden on users and instead are efficient, usable, and desirable.

For example, the QPP team takes a human-centered design approach, building an understanding of the clinician community in order to create programs and services. This understanding and ongoing feedback from the clinician community are crucial to designing and prioritizing the most compelling features for their customers. During a study, for instance, QPP will gather customer feedback about an existing design, a new concept, or an interactive prototype. They may also explore a customer’s usage of a feature to understand any pain points or gather perspectives on a relevant topic. In addition, prior research by the QPP team has improved the QPP digital experience, such as easier submission and actionable feedback. More targeted improvements will benefit from more robust information collections.

Some additional examples of how such research can improve the design and development of products, services, and policies include:

  • Usability Testing: The rule of thumb is that researchers can identify 80% of usability issues by testing 5-8 end users. However, good human-centered design means that researchers may need to conduct multiple rounds of tests to ensure the product is designed right. Also, if numerous user types use system features, then this, too, necessitates engaging with more than nine people to gain accurate findings (see the sketch after this list).

  • User Interviews: Human-centered design means designing and developing products not based on assumptions about the intended users, but on regular engagement with those users about their needs and goals. Therefore, teams need to converse periodically with users at all stages of a project initiative.

  • Focus Group: Meeting with multiple users at the same time can provide the business with ideas about what to build to address customer needs and goals. A common practice is to conduct Joint Application Design (JAD) focus group sessions regularly to receive feedback throughout design and development.
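
The 80% rule of thumb cited in the Usability Testing example above is commonly explained with the Nielsen/Landauer problem-discovery model. The sketch below is illustrative only; the per-user detection rate (p = 0.31) is an assumption drawn from published industry research, not a figure from this collection.

# Illustrative sketch of the commonly cited problem-discovery model.
# The detection rate p = 0.31 is an assumed, published industry estimate.
def share_of_issues_found(n_users: int, p: float = 0.31) -> float:
    """Expected share of usability issues found after testing n_users."""
    return 1 - (1 - p) ** n_users

for n in (5, 8, 9):
    print(f"{n} users: ~{share_of_issues_found(n):.0%} of issues found")
# Under the assumed model: 5 users ~84%, 8 users ~95%, 9 users ~96%.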


Any collections under this generic clearance must meet the following criteria:

  • The collection is voluntary;

  • The collection is low-burden for respondents (based on considerations of total burden hours, total number of respondents, or burden-hours per respondent) and low-cost for the Federal Government;

  • The collection is non-controversial and does not raise issues of concern to other federal agencies;

  • The results are not intended to be disseminated to the public;

  • Information gathered will not be used for the purpose of substantially informing influential policy decisions; and

  • The collection is targeted to the solicitation of opinions from respondents who have experience with the program or may have experience with the program in the future.



  2. Information Users

The information collected from voluntary participants will help CMS CCSQ inform the design, selection, testing, modification, and expansion of innovative healthcare quality information systems. For example, EQRS is a series of applications supporting the Medicare End-Stage Renal Disease Program, iQIES is CMS’s major tracking, analysis, and data repository for provider quality of care, and QPP collects and scores data, and provides feedback to clinicians.

CCSQ serves as the focal point for all quality, clinical, and medical science issues, survey and certification activities, and related policies for CMS programs. Because CMS employees, contractors, and partners are responsible for creating and improving human-centered products, services, and policies, they must, by definition, engage with and gather regular feedback from the public whom these products, services, and systems serve.


  3. Use of Information Technology

Research consisting of surveys is mostly quantitative in nature and is collected in an electronic format. However, end-user research and design activities rely primarily on qualitative feedback measures, which require remote or in-person synchronous feedback. Therefore, where appropriate, CMS will collect information electronically and/or use online collaboration tools to reduce burden. For example, advances in communication technologies now allow researchers to interact with respondents through online collaboration rather than in-person meetings. Furthermore, advances in usability research platforms feature skip logic, so respondents can be redirected to specific tasks or questions based on their answers to previous questions, which reduces burden.
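
As an illustration of the skip-logic behavior described above, the following minimal sketch shows how a survey might branch respondents based on a previous answer. The question identifiers and branching rules are hypothetical and do not come from any CCSQ instrument.

# Hypothetical skip-logic rules: (current question, answer) -> next question.
SKIP_RULES = {
    ("used_feature", "No"): "end_of_survey",          # skip detail questions entirely
    ("used_feature", "Yes"): "satisfaction",
    ("satisfaction", "Dissatisfied"): "pain_points",
    ("satisfaction", "Satisfied"): "end_of_survey",
}

def next_question(current_id: str, answer: str, default: str = "end_of_survey") -> str:
    """Return the next question ID based on the answer just given."""
    return SKIP_RULES.get((current_id, answer), default)

print(next_question("used_feature", "No"))             # -> end_of_survey
print(next_question("satisfaction", "Dissatisfied"))   # -> pain_points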

  4. Duplication of Efforts

No similar data are gathered or maintained by CIPST or available to CMS from other sources. The information collections will employ human-centered design best practices and principles. Human-centered design is an intentional process that considers the customer’s needs, motivations, and limitations when using a product or service. The information collections will empower CIPST to immerse themselves in the customer’s experience to gain empathy, understand the customer’s context, elicit feedback, and test prototypes to deliver products and services that improve public trust.

  5. Small Businesses

Small businesses or other small entities may be involved in these efforts. However, CIPST will minimize the burden on them of information collections approved under this clearance by sampling, asking for readily available information, and using short, easy-to-complete information collection instruments. If we need to conduct interviews with small businesses or entities, we will schedule interviews conveniently to minimize the disruption of daily activities.

  6. Less Frequent Collection

Human-centered design is a design and management framework that develops solutions to problems by involving the human perspective in all steps of the problem-solving process. In the context of CIPST and CMS/CCSQ research, design, and development, the premise is that end users need to be engaged throughout each step of the process for it to meet their needs and goals effectively. Engaging end users early and often reduces product development burdens and increases the likelihood of developing solutions and designs that are useful, usable, and desirable.

  7. Special Circumstances

There are no special circumstances.

  8. Federal Register/Outside Consultation

The 60-day Federal Register notice was published on February 29, 2024 (89 FR 14846). There were no public comments.

The 30-day Federal Register notice was published on May 8, 2024 (89 FR 38901).

  9. Payments/Gifts to Respondents

Generally, CIPST will not provide payment or other forms of remuneration to respondents of its various information collections; however, there are exceptions.

There may be instances where we need to include an incentive, since paying incentives to test participants is standard practice in research and usability testing. For example, because participation is voluntary and requires that people give their time for free, a token of appreciation may be necessary to gain the involvement of a good group of volunteers. In such instances, a nominal-cost gift card may be offered to thank participants for their time. If information collections require an incentive, CMS will provide OMB with additional justifications in the request for clearance of these activities.

  10. Confidentiality

CMS pledges privacy to the extent provided by law.

  11. Sensitive Questions

No questions of a sensitive nature are involved.

  12. Burden Estimates (Hours & Wages)

There are numerous types of collections, as Table 1 (Burden Cost, Collection Types and Activities) indicates. The total annual burden is 17,850 hours, with a total collection burden cost of $1,819,182.75 for 54,750 responses. The estimated total annual cost burden, including costs to the Federal Government, is $13,624,528.

Table 1 - Burden Cost, Collection Types and Activities

Collection Types | Estimated Annual # of Activities | Estimated Annual # of Responses | Time Per Response | Hours Per Response | Annual Hour Burden | Cost Per Response¹ | Annual Cost Burden
Card Sorting | 4 | 60 | 1 hour | 1.0 | 60 | $101.92 | $6,114.90
Focus Groups | 6 | 60 | 2 hours | 2.0 | 120 | $203.83 | $12,229.80
Usability Testing | 60 | 600 | 1.5 hours | 1.5 | 900 | $152.87 | $91,723.50
Interviews | 20 | 200 | 1 hour | 1.0 | 200 | $101.92 | $20,383.00
Contextual Inquiry/Field Studies | 24 | 240 | 2 hours | 2.0 | 480 | $203.83 | $48,919.20
First Click Tests/Tree Testing | 2 | 90 | 1 hour | 1.0 | 90 | $101.92 | $9,172.35
Participatory Design | 350 | 3,500 | 1 hour | 1.0 | 3,500 | $101.92 | $356,702.50
Survey | 20 | 50,000 | 15 minutes | 0.25 | 12,500 | $25.48 | $1,273,937.50
Collection Burden Totals | 486 | 54,750 | N/A | N/A | 17,850 | N/A | $1,819,183
Estimated Annual Federal Cost² | N/A | N/A | N/A | N/A | N/A | N/A | $11,805,345.00
Total Annual Cost Burden | N/A | N/A | N/A | N/A | N/A | N/A | $13,624,528


Table 2 - Burden Cost, Wages

Occupation | Mean Hourly Wage - Base | Total Compensation - Hourly
Medical and Health Services Managers, https://www.bls.gov/oes/current/oes119111.htm (2022) | $61.53 | $123.06
Medical Records Specialists, https://www.bls.gov/oes/current/oes292072.htm (2022) | $24.56 | $49.12
Registered Nurses, https://www.bls.gov/oes/current/oes291141.htm (2022) | $42.80 | $85.60
Computer and Information Research Scientists, https://www.bls.gov/oes/current/oes151221.htm (2022) | $74.94 | $149.88
Average | -- | $101.92
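
The following minimal sketch, using only the figures in Tables 1 and 2, re-derives the average hourly compensation, the total annual hour burden, and the total collection burden cost cited in this section.

# Re-derive the burden figures from Tables 1 and 2 (values copied from the tables).
hourly_compensation = [123.06, 49.12, 85.60, 149.88]   # Table 2, total compensation per hour
avg_rate = sum(hourly_compensation) / len(hourly_compensation)
print(f"Average hourly compensation: ${avg_rate:.2f}")  # ~ $101.92

# Table 1 rows: (collection type, annual responses, hours per response)
rows = [
    ("Card Sorting", 60, 1.0),
    ("Focus Groups", 60, 2.0),
    ("Usability Testing", 600, 1.5),
    ("Interviews", 200, 1.0),
    ("Contextual Inquiry/Field Studies", 240, 2.0),
    ("First Click Tests/Tree Testing", 90, 1.0),
    ("Participatory Design", 3500, 1.0),
    ("Survey", 50000, 0.25),
]
total_hours = sum(responses * hours for _, responses, hours in rows)
total_cost = sum(responses * hours * avg_rate for _, responses, hours in rows)
print(f"Annual hour burden: {total_hours:,.0f}")        # 17,850
print(f"Annual collection burden cost: ${total_cost:,.2f}")  # ~ $1,819,182.75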


  13. Capital Costs

No capital costs are associated with this information collection.

  14. Cost to Federal Government

The annual cost to the Federal Government for user experience research is approximately $11,805,345 across five CIPST contracts.

This cost includes $9,146,000 for researchers and other related contractor roles and $2,659,345 in total compensation for 13 GS-13 employees who manage the research contracts and work.

Table 3 - CIPST Contracts

Contract | Annual Cost
Contract A | $1,800,000.00
Contract B | $3,816,000.00
Contract C | $530,000.00
Contract D | $2,000,000.00
Contract E | $1,000,000.00
Total | $9,146,000.00
Average | $1,829,200.00


Table 4 - Wages

GS-13 Employees | Compensation
Average salary (2024) | $102,282.50
Average total compensation | $204,565.00
Total compensation for 13 employees | $2,659,345.00
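
The following minimal sketch, using only the figures in Tables 3 and 4 and the collection burden total from Table 1, re-derives the annual federal cost and the total annual cost burden.

# Re-derive the federal cost figures from Tables 3 and 4 (values copied from the tables).
contracts = [1_800_000, 3_816_000, 530_000, 2_000_000, 1_000_000]  # Table 3
contract_total = sum(contracts)                      # $9,146,000

avg_salary_2024 = 102_282.50                         # Table 4
total_comp_per_employee = avg_salary_2024 * 2        # $204,565.00
gs13_total = total_comp_per_employee * 13            # $2,659,345.00

federal_cost = contract_total + gs13_total           # $11,805,345.00
collection_burden = 1_819_182.75                     # Table 1 collection burden total
print(f"Annual federal cost: ${federal_cost:,.2f}")
print(f"Total annual cost burden: ${federal_cost + collection_burden:,.2f}")  # ~ $13,624,528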


  15. Changes to Burden

This is an information collection renewal for CMS-10706. Changes reflect updated compensation information for public burden costs, federal employee salaries, and expected contractor costs. Increases to proposed survey activities reflect researchers' need to gather regular feedback from end users about product and service solutions. Changes to other research activities reflect expected increases in responses.

CMS is requesting 17,850 burden hours in this extension to its current generic clearance. The total number of burden hours already approved (4,957) has been reduced by 1,901 hours for the approved GenICs listed below:

GenIC Title | Burden Hours Deducted From Total
(CMS-10804) QPP Site Intercept Survey | 100
Center for Clinical Standards and Quality (CCSQ) Support Central Customer Satisfaction Survey (CMS-10818) | 93
Chat/CCSQ Support Central – Basic information questions (CMS-10832) | 508
Customer Effort Scoring - Surveys (CMS-10833) | 400
Hospital Quality Reporting User Language Card Sort (CMS-10871) | 18
Hospital Quality Reporting User Research Participation Interest Survey (CMS-10817 revised) | 67
Internet Quality Improvement and Evaluation System (iQIES) HCD User Research Form (CMS-10808) | 167
Internet Quality Improvement and Evaluation System (iQIES) Idea Portal | 83
QPP Submissions Intercept Survey (CMS-10838) | 125
Single Ease Question for User Insights on Hospital Quality Reporting (CMS-10800 revision) | 6
CCSQ Support Central Knowledge/Resource Center Survey (CMS-10886) | 167
CCSQ Support Central Live Chat and Virtual Agent Enhancement Survey (CMS-10896) | 167
Total | 1,901


The following GenICs were approved after the 60-day public comment period: CMS-10886 and CMS-10896. Fourteen GenICs have been approved in total; two of them represent revisions. The resulting 12 GenICs are being carried forward.



  16. Publication/Tabulation Dates

The information will not be published. It will be used for internal decision-making purposes.

  17. Expiration Date

CMS will display the expiration date on the collection instruments.

  18. Certification Statement

There is no exception to the certification statement.



1 Cost per participant calculation based upon targeted occupations for respondents (see Table 2 - Burden Cost, Wages). The mean hourly wage is from U.S. Bureau of Labor Statistics, Occupational Employment and Wages, May 2022. The hourly wage was multiplied by two to determine total hourly compensation, and then an average of the four occupation types was determined to be $101.92.

2 The annual cost to the Federal Government includes user experience research across five CIPST contracts. See section 14 for additional information.

