Supporting Statement – Part A CMS-10706
Generic Clearance for the Center for Clinical Standards and Quality IT Product and Support Teams (CMS-10706)
Background
The CMS Center for Clinical Standards and Quality (CCSQ) is responsible for administering the information systems through which the public submits healthcare-related information. While beneficiaries ultimately benefit, the primary users of CCSQ IT Product and Support Teams (CIPST) systems are healthcare facility employees and contractors, who are responsible for collecting and submitting beneficiary data to CMS in order to receive merit-based compensation.
The generic clearance will allow a rapid response to inform CMS initiatives using a mixture of qualitative and quantitative consumer research strategies (including formative research studies and methodological tests) to improve information systems that serve CMS audiences. CMS implements human-centered methods and activities to improve policies, services, and products. As information systems and technologies are developed or improved, they can be tested and evaluated with end users to gather feedback on utility, usability, and desirability. The overall goal is to apply a human-centered engagement model to maximize the extent to which CMS CIPST product teams can gather ongoing feedback from consumers. Feedback helps engineers and designers arrive at better solutions, thereby minimizing the burden on consumers and meeting their needs and goals.
The activities under this clearance involve voluntary engagement with target CIPST users to obtain design and research feedback. Voluntary end-users will be drawn from samples of self-selected customers, as well as convenience samples, with respondents selected either to cover a broad range of customers or to include specific characteristics related to certain products or services. All information collected under this clearance will be used in quantitative and qualitative studies of human-computer interaction during information system development. We will use the findings to create the highest possible public benefit.
Each of the structured and unstructured activities listed below may use identical questions and may include more than ten respondents. Research that is quantitative in nature requires a larger sample size and would consist of more than ten respondents to provide meaningful results. Each proposed test or activity involving identical questions and requesting responses from more than ten respondents will be submitted for approval separately.
Research activities to inform the utility, usability, and desirability of information system development may include:
Card sorting: Card sorting is a method used for determining information architecture labels and structures to be used in a Website or app and gives insights to understand users’ mental models better.
Cognitive testing: Cognitive testing is usually conducted as an interview and is essential to producing a compelling product. It provides vital insights into how people interpret the language and graphics in a product and how they connect what they see to their own experiences.
Field Studies: Field research, also known as Contextual Inquiry, is conducted in the user's context and location. Researchers learn the unexpected by leaving the office and observing people in their natural environment.
First click tests: First Click Testing examines what a test participant would click first on the interface to complete their intended task. An analysis may be performed on a functioning Website, a prototype, or a wireframe.
Focus Group: A focus group is a moderated discussion that typically involves 5 to 10 participants. Through a focus group, you can learn about users' attitudes, beliefs, desires, and reactions to concepts.
User Interviews: A user interview is a method of gaining information by asking questions that relate to your objectives concerning the user's experience with your product/service.
Usability testing (Moderated and unmoderated): Usability testing includes a range of test and evaluation methods such as automated evaluations, inspection evaluations, operational evaluations, and human performance testing. In a typical performance test, users perform a variety of tasks with a prototype (or an operational system) while observers note what each user does and says, and performance data is tracked and recorded. One of the primary purposes of usability testing is to identify issues that keep users from meeting the usability goals of a Website. Moderated usability testing is conducted with the active participation of a facilitator; unmoderated usability testing sessions are completed by the participant alone.
Participatory Design: Participatory design is a method for designing digital products and services. It involves stakeholders, end-users, and the team in the design process to help ensure that the end product meets the needs of users (e.g., Customer Journey Workshop, Service Blueprinting).
Survey: A survey is a set of questions used to collect topic-specific information from a representative sample of your target audience. Since surveys can be relatively inexpensive, executed quickly, and gather a broad set of data, they are used to collect information on a wide range of topics.
Tree testing: Tree testing is a usability technique for evaluating the findability of topics on a Website. It's also known as 'reverse card sorting' or 'card-based classification.' Tree testing is done on a simplified text version of your site structure – without the influence of navigation aids and visual design.
Justification
Need and Legal Basis
The Health Information Technology for Economic and Clinical Health (HITECH) Act is part of the American Recovery and Reinvestment Act (ARRA) of 2009. As noted in the HITECH Act, CMS is responsible for defining "meaningful use" of certified electronic health record (EHR) technology and developing incentive payment programs for Medicare and Medicaid providers.
CMS is continually implementing and updating information systems as legislation and requirements change. To support this initiative, CIPST teams must have the capacity for engagement with users in an ongoing variety of research, discovery, and validation activities to create and refine systems that do not place an undue burden on users and instead are efficient, usable, and desirable.
Information Users
The primary CIPST users are not beneficiaries, but rather representatives of healthcare facilities who are responsible for submitting aggregate healthcare data to CMS. The information collected from voluntary participants by CIPST product team members will help CMS CCSQ inform the design, selection, testing, modification, and expansion of innovative healthcare quality information systems. Because CMS employees, contractors, and partners are responsible for creating and improving human-centered products, services, and policies, they must, by definition, engage with and gather regular feedback from the public whom these products, services, and systems serve.
Some examples of how such research can improve the design and development of products, services, and policies include:
Usability Testing: The rule of thumb is that researchers can identify 80% of usability issues by testing 5-8 end users. However, good human-centered design means that researchers may need to conduct multiple rounds of tests to ensure the product is designed correctly. Also, if numerous user types use system features, this too may necessitate engaging with more than nine people to obtain accurate findings.
User Interviews: Human-centered design means designing and developing products, not based on assumptions about the intended users, but regularly engaging with these people about their needs and goals. Therefore, teams need to converse periodically with users at all stages of a project initiative.
Focus Group: Meeting with multiple users at the same time can provide the business with ideas of what to build to address customer needs and goals. A common practice is to conduct Joint Application Design (JAD) focus group sessions regularly to receive ongoing feedback over the duration of design and development.
Use of Information Technology
Both qualitative and quantitative information collection will take place with end-users to determine the utility, usability, and desirability of CCSQ information systems. Research consisting of surveys is mostly quantitative in nature and collected in an electronic format. However, end-user research and design activities rely primarily on qualitative feedback measures, which require remote or in-person synchronous feedback.
Duplication of Efforts
This information collection does not duplicate any other effort, and the information cannot be obtained from any other source.
Small Businesses
We expect the impact on small businesses that are CIPST end users to be minimal; their participation involves activities such as observation, shadowing, contextual inquiry, and A/B testing.
Less Frequent Collection
Human-centered design is a design and management framework that develops solutions to problems by involving the human perspective in all steps of the problem-solving process. In the context of CIPST and CMS/CCSQ research, design, and development, the premise is that end-users must be engaged throughout each step of the process in order for solutions to meet their needs and goals effectively. Engaging end-users early and often reduces product development burdens and increases the likelihood of developing solutions and designs that are useful, usable, and desirable.
Special Circumstances
There are no special circumstances.
Federal Register/Outside Consultation
The 60-day Federal Register notice was published on June 22, 2020 (85 FR 37456). We received no public comments.
The 30-day Federal Register notice was published on September 11, 2020 (85 FR 56227).
Payments/Gifts to Respondents
Because participation is entirely voluntary, discretionary compensation in the form of small gifts may be used to gain the involvement of a full group of volunteers. For example, a $5 Starbucks gift card may be provided to thank participants for their time.
Confidentiality
Submissions of information to CMS are public information, and no personal identifying information will be collected. Respondents receive no assurance of confidentiality. Information collected will be used to assist product teams in making informed CIPST design and development decisions.
Sensitive Questions
No questions of a sensitive nature are involved.
Burden Estimates (Hours & Wages)
There are numerous types of collections, as Table 1 - Burden Cost, Collection Types and Activities indicates. The total annual burden is 4,957 hours, with a total collection burden cost of $425,508.88 for 11,476 respondents. The estimated annual cost burden, including costs to the Federal Government, is $5,310,877.88.
Table 1 - Burden Cost, Collection Types and Activities
Collection Types | Estimated Annual # of Activities | Estimated Annual # of Responses | Time Per Response | Hours Per Response | Annual Hour Burden | Cost Per Response¹ | Annual Cost Burden
Card Sorting | 2 | 30 | 30 minutes | 0.50 | 15 | $42.92 | $1,287.60
Focus Groups | 6 | 60 | 2 hours | 2.0 | 120 | $171.68 | $10,300.80
Usability Testing | 60 | 480 | 1.5 hours | 1.5 | 720 | $128.76 | $61,804.80
Interviews | 20 | 120 | 1 hour | 1.0 | 120 | $85.84 | $10,300.80
Contextual Inquiry/Field Studies | 24 | 48 | 2 hours | 2.0 | 96 | $171.68 | $8,240.64
First Click Tests/Tree Testing | 2 | 90 | 1 hour | 1.0 | 90 | $85.84 | $7,725.60
Participatory Design | 81 | 648 | 2 hours | 2.0 | 1,296 | $171.68 | $111,248.64
Survey | 20 | 10,000 | 15 minutes | 0.25 | 2,500 | $12.88 | $214,600.00
Collection Burden Totals | 215 | 11,476 | N/A | N/A | 4,957 | N/A | $425,508.88
Estimated Annual Federal Cost² | N/A | N/A | N/A | N/A | N/A | N/A | $4,885,369.00
Total Annual Cost Burden | N/A | N/A | N/A | N/A | N/A | N/A | $5,310,877.88
Table 2 - Burden Cost, Wages
Occupation | Mean Hourly Wage - Base | Total Compensation - Hourly
Medical and Health Services Managers | $54.68 | $109.36
Medical Records and Health Information Technicians at https://www.bls.gov/oes/2018/may/oes292071.htm | $21.16 | $42.32
Registered Nurses | $36.30 | $72.60
Computer and Information Research Scientists | $59.54 | $119.08
Average | -- | $85.84
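For illustration, the cost figures in Table 1 follow from the averaged hourly compensation shown in Table 2, as described in footnote 1. The sketch below (Python, illustrative only) reproduces that arithmetic using the Card Sorting row of Table 1 as an example.

# Average total hourly compensation across the four occupations (Table 2):
# each mean hourly base wage is doubled, then the four results are averaged.
base_wages = [54.68, 21.16, 36.30, 59.54]
avg_hourly_compensation = sum(2 * w for w in base_wages) / len(base_wages)  # $85.84

# Card Sorting row of Table 1: 30 annual responses at 0.50 hours per response.
responses = 30
hours_per_response = 0.50
cost_per_response = hours_per_response * avg_hourly_compensation  # $42.92
annual_hour_burden = responses * hours_per_response               # 15 hours
annual_cost_burden = responses * cost_per_response                # $1,287.60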
Capital Costs
No capital costs are associated with this information collection.
Cost to Federal Government
The annual cost to the Federal Government for user experience research is approximately $4,885,369.00 across four CIPST contracts.
This cost includes $2,311,380.00 for researchers and other related contractor roles, and $2,573,990.00 total compensation for 13 GS-12 employees who manage the research contracts and work.
Table 3 - CIPST Contracts
Contract | Annual Cost
Contract A | $670,704.00
Contract B | $1,000,000.00
Contract C | $614,276.00
Contract D | $26,400.00
Total | $2,311,380.00
Average | $577,845.00
Table 4 - Wages
GS-12 Employees | Compensation
Average salary | $99,000.00
Average total compensation | $197,999.00
Total compensation for 13 employees | $2,573,990.00
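For reference, the figures above combine into the overall totals as follows; the sketch below (Python, illustrative only) uses the values reported in Tables 1, 3, and 4.

contract_costs = 670_704.00 + 1_000_000.00 + 614_276.00 + 26_400.00  # $2,311,380.00 across four contracts (Table 3)
gs12_compensation = 2_573_990.00   # total compensation for 13 GS-12 employees (Table 4)
# Together these components make up approximately the reported annual Federal cost.
federal_cost = 4_885_369.00        # annual cost to the Federal Government
collection_burden = 425_508.88     # collection burden total (Table 1)
total_annual_cost_burden = federal_cost + collection_burden  # $5,310,877.88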
Changes to Burden
This is a new information collection.
Publication/Tabulation Dates
The information will not be published. It will be used for internal decision-making purposes.
Expiration Date
CMS will display the expiration date on the collection instruments.
Certification Statement
There is no exception to the certification statement.
1 Cost per participant calculation based upon targeted occupations for respondents (see Table 2 - Burden Cost, Wages). The mean hourly wage is from U.S. Bureau of Labor Statistics, Occupational Employment and Wages, May 2018. The hourly wage was multiplied by two to determine total hourly compensation, and then an average of the four occupation types was determined to be $85.84.
2 The annual cost to the Federal Government includes user experience research across four CIPST contracts. See section 14 for additional information.