User Experience Self-Assessment Tool v2.0

Generic Clearance for the Comprehensive Child Welfare Information System (CCWIS) Review and Technical Assistance Process

OMB #: 0970-0568

Expiration Date: 04/30/2024

Introduction

A CCWIS must be usable to support the administration of the title IV-B and IV-E plans efficiently, economically, and effectively under 45 CFR § 1355.52(a). The regulations for Comprehensive Child Welfare Information Systems (CCWIS) give title IV-E agencies an opportunity to build flexible and innovative systems that support programmatic priorities and focus on the needs of end-users, agency partners, and impacted populations. CCWIS projects should utilize user experience (UX) design practices to ensure systems are usable and accessible. While user testing and review by a UX professional are not CCWIS requirements, agencies may find that these optional efforts improve user acceptance and the overall usability of their CCWIS. Strong UX design moves beyond compliance “box checking” and incorporates an understanding of the diverse individuals who interact directly with the system and the populations impacted by the system’s use. The goal of UX design is to implement user-friendly, easy-to-understand systems that promote effective child welfare practice. Additionally, considering diversity, equity, inclusion, and accessibility (DEIA) factors provides opportunities to better understand program needs and build innovative solutions.

To implement the Biden-Harris Administration’s equity agenda described in Executive Order 13985, the Administration for Children and Families (ACF) published IM ACF-IM-IOAS-22-01 in February 2022, describing how ACF will encourage its grantees to assess and address how their programs and policies may perpetuate systemic barriers for children and families of color. The memorandum reflects ACF’s commitment to advancing racial equity. The Division of State Systems (DSS) is committed to promoting equity in all aspects of project work and to partnering with agencies to involve non-traditional populations in system planning and implementation activities.

Purpose

UX design is an important aspect of providing technology to improve program service delivery. A disciplined UX approach is critical to assisting agencies with translating user wants and needs into intuitive digital experiences. UX ensures that content is clear and in plain language. Additionally, UX aids in developing systems that function logically and easily so that tasks are performed seamlessly. In designing and developing a CCWIS, agencies should engage end-users on a periodic basis to understand program needs, technology impact, and opportunities to improve service delivery.

This self-assessment tool helps guide agencies to improve CCWIS UX by covering the common qualitative guidelines, referred to as heuristics, that UX professionals employ when evaluating products such as websites and applications. This tool is not intended to replace end-user testing or review by a UX professional. Ensuring that a CCWIS helps the agency manage the child welfare program, provide accurate, timely, and complete data, and administer efficient business processes is a critical aspect of federal requirements, and it depends upon ongoing involvement of end-users and impacted populations. This tool should assist title IV-E agencies in developing effective and efficient modules and in measuring the inclusion and impact of UX- and DEIA-informed design in the project. In terms of project management, this tool can assist in identifying who to include in testing and other development activities and in ensuring adequate collaboration among different groups.

Important Terms

This section establishes key terms and definitions to promote a shared language and understanding of usability and equity concepts. For preferred terms for different groups, please see the CDC guide: Preferred Terms for Select Population Groups & Communities | Gateway to Health Communication | CDC.

  • End-user – Refers to those who directly interact with the CCWIS and may include:

    • Agency Staff (field and administrative levels)

    • Child Welfare Contributing Agencies and providers

    • Youth and families

    • General public, if external dashboards or portals are accessible

  • Impacted populations – Refers to individuals and groups who may be or are involved with the child welfare agency and whose information is collected and stored in the CCWIS.

  • Partners – Refers to any persons or organizations that have a special interest in the success of the CCWIS. One example of a partner could be an agency’s foster care ombudsman.

  • Equity – The consistent and systematic fair, just, and impartial treatment of all individuals [1].

  • Accessibility – How easy it is for people with and without disabilities (visual, hearing, intellectual, learning, etc.) to access and consume information and complete tasks within a system. It applies to every element of a digital product, including images, content, language, links, tables, multimedia, forms, downloads, navigation, and markup.

  • User Experience – The overall experience of a person using the CCWIS, especially related to ease of use and how well it meets user expectations.


Tool Format

This self-assessment tool is divided into sections as outlined in the chart below. Each question and additional consideration has a unique Element # for easy reference. Please refer to the instructions in Technical Bulletin #7 or contact your federal analyst if you have questions about the tool or a specific element.

Section: Element #

  • Overview – Background Information: J.A.xx

  • Self-Assessment – Part 1 – Usability: J.B1.xx

  • Self-Assessment – Part 2 – User Engagement and Equity: J.B2.xx

  • Resources and Additional Considerations: J.C.xx



A. Overview and Background Information

In the Overview and Background Information section, agencies may collect information on CCWIS end-users and their needs for the CCWIS. This section provides a framework for assessing whether the UX will help end-users complete tasks. Please answer each question fully. If a question is not applicable to the system or module you are evaluating, indicate “NA” and provide a reason.

J.A.01 Self-assessment tool completion date:




J.A.02 Identify the end-user groups and impacted populations for the CCWIS functionality and/or module. Describe how the CCWIS functionality and/or module is being tested for usability with each user group and what design decisions were affected by user testing. Some examples of testing methods include focus groups, interviews, surveys, and First Click Testing [2]. Remember that some partners (e.g., product owners) are not stand-ins for end-users.

  • What user groups do you involve in the design, development, testing, and maintenance and operations (M&O) phases? Is there a diverse audience with backgrounds that accurately represent the community impacted?

  • How do you involve user groups in testing? (Remotely or on-site? With every build?)

  • How do you communicate the testing results and resolutions back to the user groups?

  • What process exists for identifying diverse user groups for testing?

If this information is included in a separate testing plan document, you may attach that document in the box provided below.




J.A.03 Brief description of how UX is being used, or was considered, in the design of the system or module. You may reference and/or attach an existing project document as needed.




J.A.04 Provide any additional comments as background regarding the UX of your agency’s system or module.


In the section below, the agency may document components, factors, and design elements of the function(s) or exchange(s) that support the user experience goals of the CCWIS. If the agency has additional goals, please include them below and add new rows as needed. We encourage agencies to simplify their responses by referencing submitted documentation, such as APDs, or by attaching design documents and screenshots.

Please answer each question fully. If a question is not applicable, indicate “NA” and provide a reason.

Part 1 – Usability

Usability refers to how easy it is for people to understand and complete tasks using a CCWIS. Specifically, usability assesses how well end-user research and the subsequent UX/user interface (UI) [3] design process achieves the intended outcomes for the organization and its end-users. A system or module should allow the end-user to access necessary information and complete tasks efficiently and without frustration, excessive clicks/navigation, or workarounds.

#

Usability Goal

Evidence that the System/Module Supports the Goal

J.B1.01

Project utilizes Human-Centered Design approaches and incorporates partners, end-users, and impacted populations in system development and ongoing support activities.

Evidence may include but is not limited to: Human-Centered Design professionals on the project team, documentation of consistent consideration of UX approaches in user stories or software development life cycle, documentation of active and frequent user group meetings/communications, survey documentation on the impact of functionality/system adoption, documentation of enhancements introduced to respond to end-user needs/issues.

J.B1.02

Textual and visual cues (e.g., breadcrumbs) help end-users orient themselves within the structure of the module or system. Design decisions are made with consideration of all users, including those of different abilities, such as neurodivergent populations.

Evidence may include but is not limited to: The layout and design of the system (user stories, screens, programming standards) are documented and consistently utilized. The flow and design should reflect an understanding of the business need and should not involve unnecessary clicks, screen changes, constant saving, or lost work items. Options exist to modify notification sounds and frequency, as well as screen zoom and font size. Hovers should provide additional information, and case IDs/person IDs should be accompanied by names. Automation processes should align with end-users’ needs and policy. Documentation may include survey data, system performance data, screen shots, user stories, and/or content/display model.

J.B1.03

Menus and navigational elements are consistent and predictable throughout. Interactive features related to navigating the system or module perform as the end-user expects.

Evidence may include but is not limited to: System reflects business practice workflows and supports the efficient and effective flow of activities. Role-based differences are reflected in common areas such as visits, activity logs, placements, case plans, eligibility, and payments. Chronological views exist, and links to access frequently needed data are strategically placed throughout common areas for easy use. Hovers, screen expansion, scrolling, and filtering may assist end-users with workflows. Documentation may include user stories, screen shots, survey data, and automated performance data.

J.B1.04

Buttons, menus, forms, labels, and similar design elements are meaningful, unambiguous, and mutually exclusive. End-users can predict the function based on the element. Essential tasks, functions, and content are prioritized in design templates.

Evidence may include but is not limited to: The design elements should be intuitive and easy for the user to utilize without delaying data entry or completion of a business process. Menus should expand and collapse with appropriate timing. Documentation of essential tasks and most used functions based on system performance data and end-user feedback, with flexibility, filters, and help aligned with priority and use. For example, notes/visits/activity logs are heavily utilized by end-users, so features such as chronological ordering of events, scrolling from one note to another to understand the story of the case, spell/grammar check, auto save, expanded windows, and hover bubbles will assist end-users in easily navigating this heavily used function. Other heavily utilized functions include case history, allegation history, placement history, case plans, etc. Documentation may include screen shots, content/design model standards, end-user feedback from surveys/interviews/meetings, and system demonstrations.

J.B1.05

Layout and presentation are consistent throughout and are available for reuse when new content or features are added. Content can be modified when changes occur in agency or federal regulations.

Evidence may include but is not limited to: A separate rules engine and documentation reflecting heavily used features and frequently changing policies, so that the design supports configuration of commonly changing values such as rates, time frames, report values/displays, exchanges, user profiles, service types, and demographic indicators (see the configuration sketch following this table). Documentation of user/deployment notes, user stories, screen shots, development standards, and enhancement changes supported with configuration rather than customization when appropriate. Screenshots should show all headings being descriptive and applied consistently. Screenshots and demonstrations should show that the CCWIS has organized information in the system with a defined taxonomy and that content is appropriately tagged to support end-users in finding what they need.

J.B1.06

End-user and partner preference does not create orphaned or duplicate data/referential integrity issues (such as placement data captured in multiple parts of the system without links/consistency).

Evidence may include but is not limited to: System demonstration, development standards, content/data model, design standards, separate rules engine, automated tools to limit duplication, standards for naming/decommissioning tables in the application, etc.

J.B1.07

The agency’s governance plan includes processes for actively measuring and improving the user experience by soliciting and prioritizing end-user, partner, and impacted populations’ feedback during all stages of development. The design plan is adaptive to individual needs. The agency utilizes automation, observation and/or other methods to assess functionality impact on business processes/end-user system adoption.

Evidence may include but is not limited to: Documentation of methods to collect and utilize end-user feedback regularly in all phases of project support (planning, development, post-implementation), such as end-user meetings, survey data, automation to monitor data entry timeliness, rules to support or guide accurate and complete data entry, online help that is dynamic and changes as the system evolves, and performance data on system time-outs/error messaging and enhancements to improve usability and performance. Design includes a separate rules engine and a documented data model to understand relationships between functions; design standards document strategies to ensure the most used functions support efficient business processes (without adding unnecessary time or inaccuracies) to complete workflows. Evidence may also include governance documentation of minutes/plans, data measuring baseline performance and improvement over time, helpdesk ticket logs and their resolution process or policies, metrics on how frequently training materials are utilized, and longitudinal data on the volume and subject areas of helpdesk tickets over time alongside a timeline of relevant enhancements.
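The following is a minimal, hypothetical TypeScript sketch of the configuration-over-customization idea described in J.B1.05: commonly changing values such as rates, time frames, and demographic indicators are stored as data that administrators can update, rather than hard-coded in the application. All names and values below are illustrative assumptions, not a prescribed CCWIS schema.

// Illustrative only: commonly changing values are held in configuration
// records rather than compiled code, so policy changes do not require
// custom development. Names and values are hypothetical.
interface PaymentRateRule {
  serviceType: string;
  dailyRate: number;
  effectiveFrom: string;   // ISO 8601 date
  effectiveTo?: string;    // open-ended if omitted
}

interface AgencyConfiguration {
  caseworkerVisitIntervalDays: number;   // time frame set by agency policy
  paymentRates: PaymentRateRule[];
  raceEthnicityOptions: string[];        // demographic indicators kept configurable
}

const exampleConfig: AgencyConfiguration = {
  caseworkerVisitIntervalDays: 30,
  paymentRates: [
    { serviceType: "foster-family-home", dailyRate: 27.5, effectiveFrom: "2024-01-01" },
  ],
  raceEthnicityOptions: ["American Indian or Alaska Native", "Asian", "Black or African American"],
};

// Look up the rate in effect on a given date, rather than hard-coding it.
function rateFor(config: AgencyConfiguration, serviceType: string, onDate: string): number | undefined {
  return config.paymentRates.find(r =>
    r.serviceType === serviceType &&
    r.effectiveFrom <= onDate &&
    (r.effectiveTo === undefined || onDate <= r.effectiveTo)
  )?.dailyRate;
}

When a rate or review interval changes under this approach, staff update the configuration record and the change takes effect without custom code or redeployment.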


Part 2 – User Engagement and Equity

Engagement refers to any activity an agency might undertake to collect requirements and feedback from end-users and impacted populations. Successful engagement begins early in the project and involves end-users, partners, and impacted populations by providing them multiple opportunities to engage with the project via a variety of formats. Agencies may engage with end-users and impacted populations individually or through user groups or communities. Agencies may wish to clarify specific user engagement strategies or expectations in Request for Proposal (RFP) and contract documents.

A primary function of a CCWIS is to collect data. In the data collection process, there are many ways that disparities and stereotypes can be perpetuated. An equity-centered design should consider the various ways that data collection can encourage accurate representation and quality analysis to improve outcomes. System design can enhance equity in several ways, such as ensuring the system has specific categories aligned with population needs (for example, tribal affiliation, national origin, and language spoken) and allowing for narrative text rather than “other”. Sensitivity to utilizing names to identify individuals (rather than only large numbers on the screen) is also indicative of a design that understands the business need.
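As a purely illustrative sketch of the data-collection points above, the hypothetical TypeScript types below capture tribal affiliation, national origin, and language spoken explicitly and pair any fixed category list with a free-text self-description instead of a bare “other” code. Field names and categories are assumptions for illustration, not a required data model.

// Hypothetical field design: explicit, population-aligned categories plus
// narrative text, so unlisted identities are not collapsed into "other".
interface TribalAffiliation {
  affiliated: boolean;
  tribeName?: string;             // narrative text rather than a fixed list
}

interface DemographicRecord {
  personId: string;
  displayName: string;            // screens show names, not only ID numbers
  primaryLanguage: string;        // language spoken, captured explicitly
  nationalOrigin?: string;
  tribalAffiliation?: TribalAffiliation;
  raceEthnicity: string[];        // selections from an agency-defined, inclusive list
  selfDescribedIdentity?: string; // free text captured alongside the selections
}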

#

User Engagement Goal

Evidence that the System/Module Supports the Goal

J.B2.01

The agency has developed a user engagement plan, which identifies user groups and impacted populations and how the project will engage with them. End-users, partners, and impacted populations are represented in processes such as feedback, targeted access, review, and development decisions. Representatives may include, but are not limited to, people of various ages, races, and ethnicities, LGBTQI+ individuals, families of origin, foster/adoptive families, and young people with lived experience.

Documentation of a feasibility study or user engagement plan/profile that identifies all the partners that utilize and/or depend upon data captured or generated by the system. User engagement strategies may also be documented in a communication or change management plan. Documentation of end-user meeting notes/videos, end-user/business profile scan to identify parties impacted by or dependent upon program and technology functions. Documentation may include working affiliations with foster youth advisory, foster parent advisory, provider advisory and other community interest groups and methods to share and obtain information. Screen shots of functions specifically targeted to populations/end-users (such as tribes or young people). Documentation of methods to collect and utilize end-user feedback regularly in all phases of project support (planning, development, post implementation) such as: end-user meetings, survey data, automation to monitor data entry timeliness, rules to support or guide accurate and complete data entry, online help that is dynamic and changes as system evolves, performance data on system time outs/error messaging and enhancements to improve usability and performance. Documentation may include system demonstrations, planning or design sessions with different population groups targeted based upon the business need.

J.B2.02

The agency has fostered the development of communities or cohorts of users to engage regularly, with consistent involvement and communication rather than once or twice with no follow-up. This helps to build trust and transparency. There are multiple mechanisms for ongoing dialogue to occur throughout all stages of development and production and feedback is used in an observable way.

Documentation may include but is not limited to sub-working groups, such as a provider group or youth advocacy group, where communication regularly occurs and input on system development, ease of use, business needs, and system effectiveness is discussed. Involvement in data migration efforts and reasonable efforts to ensure demographic data is correct and identified as such should exist. Documentation of a history of enhancements/functions developed and/or improved over time to meet the needs of specific interest groups (such as an ombudsman, legal representative, or guardian ad litem). DEIA plans or a DEIA advisory group having a significant role in the governance and development of the system.

J.B2.03

The agency maintains a helpdesk that actively notifies end-users of ongoing project activity, as a part of its user engagement plan to consistently engage with end-users.

Documentation may include but isn’t limited to a help desk repository that captures and groups items by need type, end-user type, priority level, etc. Documentation of the timeliness of data fixes and enhancement requests posted or regularly communicated back to end-users. Documentation may include survey or interview data, regular reports of defects and timelines, and APD statistics on helpdesk efficiency and effectiveness.

J.B2.04

The agency reacts to feedback provided by end-users, partners, and impacted populations within a reasonable time. “Reasonable” should be defined with consideration of priorities for end-users and impacted populations.

See J.B2.03. Evidence may include publicly reviewing APD schedules, training that incorporates recent or new features as a result of feedback, and user group meetings that provide ongoing status updates. End-users, partners, and impacted populations provide feedback on work in progress, such as wireframes, demos, prototypes, or proofs of concept. There is a process to incorporate feedback to improve on project goals. Documentation of planful design sessions and development methods that are capable of understanding system adoption/use and adapting to improve usability and responsiveness to program business needs.

J.B2.05

All functionality is tested by end-users during the user acceptance testing (UAT) phase, and end-users that participate are trained on how to be effective testers.

Documentation of regular testing results from specific user types based on the functionality and use of the enhancements/functions. Documentation of testing training plan, participants, and effectiveness of plan.

J.B2.06

The agency offers end-users, partners, and impacted populations plain language user guides or equivalent documentation on system functionality. Mechanisms exist to assist in the alignment between project and program goals.

Documentation of plain language standards, screen shots reflecting use of plain language, user stories written in plain language, survey data/observation/automation to understand user behavior and system adoption/understanding.

J.B2.07

End-users, partners, and impacted populations are aware of the best channels for communicating feedback and are informed about how their feedback is used in system planning and development.

Documentation of varying communication strategies to involve populations in all aspects of project phases and follow up methods to describe how feedback is regularly incorporated into project plans and system design/implementation.

J.B2.08

Systems have automated reports and functions to assess usability and equity.

Documentation may include but isn’t limited to: reports showing access, time outs, error messages and time to correct, functions targeted to specific populations/uses, data reports and methods sensitive to factors impacting equity (poverty, health, ethnicity etc.).

J.B2.09

Impacted populations have an active role in their case planning, and the system aligns with that policy.

Strong evidence would be that policies exist that allow for access and that the system reinforces those policies. Policies should include documenting and accessing data.

J.B2.10

A process exists to identify the various individuals impacted by CCWIS or child welfare and to ensure that there is consent, transparency, and data integrity.

Examples might be data and reports generated by the CCWIS that provide demographic information to end-users, or copies of external communication from the agency to its communities or partners about important developments with the CCWIS. Screenshots of automated consent features, if available. System provides data reports with the ability to identify trends and factors that contribute to the success of populations.

J.B2.11

If system functionality includes artificial intelligence (AI) and/or predictive analytics algorithms, the functionality is publicly accessible, visible to appropriate groups, and able to be tested. A process exists for providing feedback and ensuring data integrity.

Examples of this feedback could be determining the outcomes being predicted, determining acceptable thresholds for false positives and negatives, and setting classification thresholds for high- and low-risk classifications. Algorithms should be visible within the system, and evidence of the flexibility to change AI or predictive analytics functionality should be included.

J.B2.12

A system of consent is developed to govern how information will be shared safely and for appropriate uses.

Data sharing agreements, screenshots of automated consent features, screenshots of alerts or reports that manage renewals and validity of these agreements and consent.

J.B2.13

The agency has data policies that are periodically updated to support evidence-based policy making. An ongoing cycle exists where data and analytics are used to improve and create policies to better outcomes for youth and families.

Data Quality Plan (DQP) aligns to your data policies and demonstrates an alignment of system functionality with agency policy and best practice. Documentation of enhancements because of policy or practice change could be used as evidence. You may also cross reference the reporting tool or data quality tool. Additionally, evidence could include equity plans.


J.B2.14

Impacted populations have access to confirm, deny, or appeal data or a process exists for their needs/interests to be considered and included when concerns arise surrounding the validity of data pertaining to them.

Policies for data correction; the system maintains a history of changes to data in key areas; limited-access portals; data and IT systems are included in a client’s “bill of rights”, covering the data that is collected about them, the rights they have to view that information and provide feedback about it, the rights they have concerning how the data is used, and how the data about them is secured. The public has access to analytical data for transparency, or a process is in place to support or encourage outside analysis of data.

J.B2.15

A process exists to align with agency and federal equity program goals. This process should consider the different populations served and include impacted populations, when possible, in these decisions.

Screenshots, reports, service utilization reports by demographic populations. One example could be that drop down menus are inclusive and consider a wide range of racial and ethnic categories (such as: Middle Eastern and North African heritage and subgroups of Asian American, Native Hawaiian and Pacific Islanders) to understand needs, trends, and outcomes. Engaging with impacted populations to determine solutions when mapping misalignment occurs could also be evidence.

J.B2.16

The agency has a policy to ensure that, when applicable, content translated through automated translation functionality is reviewed and accurate.

Evidence may include involving native speakers in the review of the functionality, functionality alignment with agency policy, screenshots of translation access in the system, and visibility and/or documentation that describes how translation of content occurs.

The resources below provide examples of user experience features and additional considerations that agencies may take into account when designing a CCWIS.

Resource 1 – Usability

Resource 2 – Web Design

Resource 3 – User Engagement and Equity

Resource 4 – Content-First Design and Content Models

Resource 5 – Plain Language

Resource 6 – Administration for Children and Families Digital Toolbox

Resource 7 – Accessibility

Resource 8 – Additional Considerations





Resource 1 – Usability

Good UX design involves user inclusion and participation. At a minimum, technology product producers should involve end-users, partners, and impacted populations in the prototyping phase to understand what to design and how to design it. They should then test the product for usability before launching it to the wider user base. A system design that includes understandable, well-documented design patterns will be more usable and easier to develop. Documenting design styles in a pattern library or front-end style guide supports the reusability and sharing of modules.

Usability.Gov

To support a user-friendly design approach to government websites, the federal government developed usability.gov to provide templates, guidelines, methods, and strategies for web content creation.

  • Usability
    Usability.gov
    This link navigates to the usability.gov home page. The site provides guides on user-centered design, usability testing, content strategy, and related subjects.

10 Usability Heuristics for User Interface Design

The Nielsen Norman Group (NN/g) is the UX industry’s recognized leader in research about usability and related topics.



Resource 2 – Web Design

U.S. Web Design System (USWDS) 2.0

The federal government provides a design system intended to help people more easily build accessible, mobile-friendly websites. Although it is intended for federal agencies, much of the system is in the public domain, and it provides a good example of how to document repeatable design components, styles, and templates.

  • Web Design
    https://designsystem.digital.gov/
    This link navigates to the U.S. Web Design System (USWDS) 2.0 home page. This site provides examples of mobile-friendly sites and the principles behind them and resources for further reading.



Resource 3 – User Engagement and Equity

Methods

Successful user engagement efforts typically use a mixture of passive and active formats. Such efforts tailor the format used to the needs of the end-users, partners, and impacted populations.

Agencies that actively solicit feedback from their users—and demonstrate that they act on the feedback they receive—create trust with their users. A relationship built on this trust creates opportunities for collaborative work between the users and the agency to improve the system, so it better meets the needs of end-users, partners, and impacted populations.

Passive forms of user engagement allow users, on their own time, to inform the project team of their needs and are generally open to any feedback the user wishes to give. Such formats may include a:

  • Help desk

  • Website forum

  • Web portal

  • User feedback application

  • Feedback button within the system

  • Dedicated feedback email address

  • Usability and accessibility report

Active forms of user engagement are initiated by the project team and actively solicit feedback, which is generally more focused than passive forms of feedback. Such forms may include activities such as:

  • Joint application development (JAD) sessions

  • Surveys

  • Listservs

  • Focus groups

  • Interviews

  • User acceptance testing (UAT)

Additional things to consider when determining user groups and advisory boards include:

  • How is membership determined?

  • Can various groups request to be members, and are advisory group meetings open to the public?

  • What roles are there in the advisory group, and how will the group ensure it consistently operates with an equitable lens?

  • If there is a provider-resource home group(s), how is membership determined? If such a group does not exist, is there another method used to obtain and make use of critical feedback about the agency that might be used to improve the use, data quality, and reporting from the IT system?

  • What is the method of accountability when challenges or complaints arise regarding DEIA within the IT system?



The goal of a CCWIS built with an equity centered system design is to be a part of a systemic framework aimed at addressing the complex disparities within the child welfare system. As such, it should involve active and ongoing participation of end-users, impacted populations, and partners throughout the entire life of the CCWIS project. The question of what matters to these groups is one that should be consistently asked, and the answers should be consistently incorporated within the project. What might matter to impacted populations, for example, is that the information collected reflects them and their needs accurately.

Something to consider when beginning to design an equity-centered CCWIS solution is possible bias that could arise from the design. Implicit bias refers to ideas, attitudes, thoughts, etc. about a particular group of people that occur automatically, outside of the conscious brain, but that still impact behaviors and decisions. It is crucial to identify any implicit biases that may exist within your project team so that you can focus on designing your CCWIS with the affected groups of individuals consistently involved. Some areas to consider when thinking about implicit bias include: What subliminal messages may be associated with the structure of data fields? What associations are made between different data fields, and do these associations impact decision making?

The need for transparency is crucial. This includes seeking public input in the plans for building a model, the problem it seeks to address/use case, and other aspects. This article speaks to the importance of engaging community at all stages.

Another area central to equity, and one where many suggest that community engagement is important, is the methods used to ensure fairness. Additional articles that speak to this and related aspects include:

To prioritize equity within your system design, it is important to understand the impact that data and system design can have on perpetuating inequities. The articles below offer a more in-depth look:

Even the language used within a CCWIS can play an important role in improving representation and reducing implicit bias. ACF has developed the following resource to address this topic: Language Bias in Child Welfare | The Administration for Children and Families (hhs.gov).



Resource 4 – Content-First Design and Content Models

Content-First Design

Content-first design, meaning designing around content before making technology decisions, is a method that may support the end-user experience and the development of a successful IT system. Structuring content by type and giving each type well-defined attributes helps the owning organization have flexible content that can be device- and platform-neutral. Because content is not locked into a specific format or system but developed within a defined structure, it is more modular and can be shared and reused on multiple platforms or by many organizations.

Content Models

Content Models are a way to organize content. The W3C describes content models and provides a helpful diagram to show how each model fits together.
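As a rough illustration of these ideas, the hypothetical TypeScript types below define two content types with well-defined attributes, so the same content can be rendered on any device or platform and tagged for reuse. The type and field names are assumptions for illustration only and are not drawn from the W3C material referenced above.

// Minimal content-model sketch: content is stored as typed, structured data
// rather than formatted pages, so a web UI, mobile view, or external portal
// can all reuse it. Names are hypothetical.
interface HelpArticle {
  id: string;
  title: string;
  summary: string;          // short, plain-language description
  body: string;             // structured text, not tied to one screen layout
  tags: string[];           // supports a defined taxonomy for findability
  lastReviewed: string;     // ISO 8601 date
}

interface GlossaryTerm {
  id: string;
  term: string;
  definition: string;
  relatedTermIds: string[];
}

type ContentItem = HelpArticle | GlossaryTerm;

// New content is authored to match one of the defined types, so each piece
// carries the attributes (title, summary, tags, and so on) the model requires.
const example: ContentItem = {
  id: "help-001",
  title: "Recording a monthly caseworker visit",
  summary: "Steps for documenting a required monthly visit.",
  body: "1. Open the activity log. 2. Select the visit type...",
  tags: ["visits", "activity log"],
  lastReviewed: "2024-01-15",
};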



Resource 5 – Plain Language

Plain Language

The U.S. Government provides guidance on writing in plain language.

  • Plain Language
    https://www.plainlanguage.gov/
    This link navigates to the plainlanguage.gov home page. The website explains and provides examples of plain language.

  • Flesch Kincaid Grade Level Test
    https://support.microsoft.com/en-us/office/get-your-document-s-readability-and-level-statistics-85b4969e-e80a-4777-8dd3-f7fc3c8b3fd2?ui=en-us&rs=en-us&ad=us
    This link navigates to a Microsoft support page. The page explains how to use the Flesch Kincaid Grade Level test and the Flesch Reading Ease test on a Word document.
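For illustration, the TypeScript sketch below applies the published Flesch Reading Ease and Flesch-Kincaid Grade Level formulas to a block of text. The syllable counter is a crude heuristic, so scores will only approximate the statistics Word reports through the page linked above.

// Rough readability sketch. Formulas:
//   Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
//   Grade Level  = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
// The syllable count below is approximate (vowel groups, silent final "e").
function countSyllables(word: string): number {
  const w = word.toLowerCase().replace(/[^a-z]/g, "");
  if (w.length === 0) return 0;
  const groups = w.replace(/e$/, "").match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 0);
}

function readability(text: string): { readingEase: number; gradeLevel: number } {
  const sentences = text.split(/[.!?]+/).filter(s => s.trim().length > 0).length || 1;
  const words = text.split(/\s+/).filter(w => /[a-z]/i.test(w));
  if (words.length === 0) return { readingEase: 0, gradeLevel: 0 };
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  const wps = words.length / sentences;   // words per sentence
  const spw = syllables / words.length;   // syllables per word

  return {
    readingEase: 206.835 - 1.015 * wps - 84.6 * spw, // higher = easier to read
    gradeLevel: 0.39 * wps + 11.8 * spw - 15.59,     // approximate U.S. school grade
  };
}

// Example: shorter sentences and plainer words raise Reading Ease and lower Grade Level.
console.log(readability("Write short sentences. Use plain words your readers know."));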



Resource 6 – Administration for Children and Families Digital Toolbox

Administration for Children and Families Digital Toolbox

The Administration for Children and Families (ACF) created the Digital Toolbox to aid ACF federal offices and bureaus in developing web products.

  • ACF Digital Toolbox
    https://www.acf.hhs.gov/digital-toolbox
    This link navigates to the ACF digital toolbox, which includes information on content, user experience, accessibility, plain language, UX, visual design, and other topics.



Resource 7 – Accessibility

Accessibility should be consistent with the needs of the users you identify. One example is bionic reading screen views for neurodivergent populations. An understanding of the language and service needs of different user groups and/or individuals impacted by the system is important. A key proactive step for those designing an equity centered system is having a basic understanding of assistive technologies.

Accessibility Approach

The end-users who will be utilizing the CCWIS are diverse and have unique needs. Meeting these individual needs is crucial to the efficiency of a CCWIS and will improve practice outcomes. For example, end-users with visual impairments may benefit from having options for larger font sizes. Describe the approaches used to make the system or module accessible to people experiencing disabilities, including hearing, vision, motor, intellectual, learning, or developmental disabilities. Approaches could include the Section 508 requirements of the Rehabilitation Act [5].

Accessibility Information 

Approach Used 

Standard and level of accessibility used for assessment.

 

Assistive technologies used for testing the system.

 

Type of audit conducted on system (manual or automated); if automated, what tool(s) were used?

 

Section 508

The U.S. General Services Administration runs Section508.gov, which is the U.S. Government-wide IT accessibility program.

  • Section 508
    https://www.section508.gov/
    This link navigates to the U.S. General Services Administration’s site for Section 508 guidance. The site assists with developing and testing content for Section 508 compliance and provides Section 508 tools and training.

Web Accessibility Initiative

The W3C is the globally recognized source for web accessibility standards and provides aids to meet WCAG guidelines.

  • W3C
    https://www.w3.org/WAI/standards-guidelines/
    This link navigates to the Web Accessibility Initiative’s W3C Accessibility Standards Overview. The page offers a collection of guidelines with links to their descriptions and information for further reading.



Web Content Accessibility Guidelines (WCAG) 2.0

Conforming to WCAG 2.0 standards is not a requirement of CCWIS, and, as with any such standard, is not endorsed by CB. It is an industry-wide standard that agencies may adopt when developing their CCWIS. The agency must have documented standard(s) that promote efficient, economical, and effective development of a reliable system [6].

The WCAG 2.0 principles are:

  • Perceivable: Users should be able to perceive the information presented through their senses.

  • Operable: Users should be able to interface and navigate throughout the website.

  • Understandable: Users should be able to understand and work within the interface.

  • Robust: Users should be able to use accessible content as technology changes.

The WCAG 2.0 standards have three levels of conformance:

  • Level A: For Level A conformance (the minimum level of conformance), the web page satisfies all the Level A Success Criteria or a conforming alternate version is provided.

  • Level AA: For Level AA conformance, the web page satisfies all the Level A and Level AA Success Criteria, or a Level AA conforming alternate version is provided.

  • Level AAA: For Level AAA conformance, the web page satisfies all the Level A, Level AA, and Level AAA Success Criteria or a Level AAA conforming alternate version is provided.

Conformance to the WCAG 2.0 standards and the levels of conformance within the 2.0 standards may be measured by a web accessibility audit. Such audits determine whether a level of conformance has been met. Agencies may perform their own audits [8] or may procure the services of a vendor to perform this task.
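As one hedged example of automated checking, the TypeScript sketch below uses the open-source axe-core library to scan the current page in a browser against WCAG 2.0 Level A and AA rules. This is only an illustration of the kind of tooling an audit might use; it is not a required or endorsed approach, and automated scans detect only a subset of issues, so manual testing with assistive technologies is still needed.

// Illustrative automated scan with axe-core (https://github.com/dequelabs/axe-core),
// run in the browser against the current document. Results list rule violations
// and the elements affected; they supplement, not replace, manual review.
import axe from "axe-core";

async function auditCurrentPage(): Promise<void> {
  const results = await axe.run(document, {
    // Limit the scan to WCAG 2.0 Level A and AA rules.
    runOnly: { type: "tag", values: ["wcag2a", "wcag2aa"] },
  });

  console.log(`Violations found: ${results.violations.length}`);
  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact ?? "unknown impact"}): ${violation.help}`);
    for (const node of violation.nodes) {
      console.log("  affected element:", node.target);
    }
  }
}

auditCurrentPage().catch(err => console.error(err));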



Resource 8 – Additional Considerations

The Additional Considerations section describes useful features an agency may wish to incorporate into the CCWIS to support a good user experience. If the agency is including these additional considerations in the CCWIS, please write “Yes” in the “Included in Agency’s CCWIS?” column.


#

Included in Agency CCWIS?

Additional Considerations

J.C5.01


The agency designs new content to contain attributes that map to a content type defined in the model.

J.C5.02


Information in the system matches the user’s mental model.

J.C5.03


Content uses simple typography.

J.C5.04


Information is chunked (grouped) and end-users are guided through inputting any required data in easily consumable sections.

J.C5.05


Color and other visual design elements add clarity, hierarchy, and enhance absorption of content but the system is still usable and accessible if the user cannot perceive these elements (e.g., a user with color blindness).

J.C5.06


The most common screen sizes, devices, and browsers of the intended user base are supported and provide a good user experience (for example, text and graphics are readable and responsive on a mobile device, touch targets are optimized for responsive layout, and content is constrained to proper reading widths on tablets).

J.C5.07


Color contrast for all elements meets WCAG 2.0 guidelines.

J.C5.08


The system uses markup such as Accessible Rich Internet Applications (ARIA) attributes to identify significant page components.

J.C5.09


The agency has a documented plan for conducting accessibility audits regularly (including for broken links), especially when new content or features are deployed.

J.C5.10


The system provides screen and field level help to assist end-users.

J.C5.11


Content is structured using a content model that defines each needed content type and its attributes.

J.C5.12


Content is developed in a format that allows it to be easily shared.

J.C5.13


Content is clear, concise, and quickly reviewable, following agency best practices. Content may follow industry best practices, such as web writing best practices and federal plain language guidelines.

J.C5.14


Content is written in a task-based, action-oriented manner. End-users are guided as to what to do by instructions, labels, tool tips, and other cues, such as field-level and page-level help.

J.C5.15


User interface (UI) is free of unnecessary decorative elements and suits the design of the system.

J.C5.16


Alternative text is provided for all relevant images, so that screen reader users can understand the key information the image contains and conveys. For images that include embedded text, that text is also included in the alt text description. When an image is used purely for decoration, the alt text is left empty, so screen reader users are not distracted from key information.

J.C5.17


Headings are used, in the correct order, to indicate and organize content structure so screen reader users can easily navigate and interpret the webpage. Style elements are separated from structure by using CSS (Cascading Style Sheets).

J.C5.18


Content is in plain language and written at an appropriate level for all user groups.

J.C5.19


Links are meaningful and clearly defined as links, with unique and descriptive naming. The end-user is informed where a link will take them next and clicking links does not cause the user to lose their progress.

J.C5.20


Tables are used to present tabular data, not for designing webpage layout. When it is necessary to use a data table, headers are used for rows and columns to help explain the relationships of cells.

J.C5.21


All multimedia elements provide alternatives such as captions and transcripts or audio-described versions.

J.C5.22


All system content can be accessed using a keyboard alone, to accommodate users with mobility disabilities who may not be able to manipulate a mouse or trackpad. Content is available across multiple devices.

J.C5.23


The tab order matches the visual order, so keyboard-only users can logically navigate site content. A "Skip to main content" option is provided at the top of each page, to allow for quick navigation for keyboard-only users. The system does not use elements that require users to hover over items with a mouse to activate.

J.C5.24


The page has appropriate color contrast. To identify required fields, color is used with other visual indicators, such as an asterisk or question mark.

J.C5.25


Blocks of content are distinguished from one another using visual separation (such as whitespace or borders).

J.C5.26


All form fields have well-positioned, descriptive labels, using a <label> tag or an ARIA property to link the label text with the form field. Related or similar fields are grouped together using field sets. Required form fields are labeled appropriately and configured to alert screen reader users. Asterisks (or similar visual indications) are also used for sighted users, people with learning disabilities, or people who speak English as a second language. (A simple label-check sketch follows this table.)

J.C5.27


All downloadable content is accessible.

J.C5.28


The agency has a documented plan for including accessibility in the development process for any future enhancements, rather than only testing for accessibility at the end of design and development.
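The hypothetical TypeScript sketch below illustrates a quick in-browser spot check for two of the considerations above: images missing alternative text (J.C5.16) and form fields without an associated <label> or ARIA name (J.C5.26). It supplements, and does not substitute for, the accessibility audits described in Resource 7.

// Spot-check sketch, run in the browser console or a test harness.
// Flags images with no alt attribute at all (decorative images should still
// carry alt="") and form fields lacking a <label>, aria-label, or aria-labelledby.
function findImagesMissingAlt(): HTMLImageElement[] {
  return Array.from(document.querySelectorAll<HTMLImageElement>("img"))
    .filter(img => !img.hasAttribute("alt"));
}

function findUnlabeledFields(): Element[] {
  const fields = document.querySelectorAll<HTMLInputElement>(
    "input:not([type=hidden]), select, textarea"
  );
  return Array.from(fields).filter(field => {
    const hasLabelElement = field.labels !== null && field.labels.length > 0;
    const hasAriaName =
      field.hasAttribute("aria-label") || field.hasAttribute("aria-labelledby");
    return !hasLabelElement && !hasAriaName;
  });
}

console.log("Images missing alt text:", findImagesMissingAlt().length);
console.log("Form fields missing an accessible label:", findUnlabeledFields().length);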



We encourage agencies to add examples of additional considerations from their CCWIS function(s) and feature(s), not on the list above, that they wish to highlight.

#

Agency-Submitted Additional Considerations











[1] EO 13985 (2021), 7009.

[2] First Click Testing examines what a test participant would click on first on the interface to complete their intended task. See: https://www.usability.gov/how-to-and-tools/methods/first-click-testing.html

[3] For more information about user interface design, please refer to Resources 2 (Web Design) and 4 (Content-First Design and Content Models).

[5] https://www.section508.gov/manage/laws-and-policies/

[6] See: 45 CFR 1355.53(a)(3).

[8] See: https://www.whoisaccessible.com/guidelines/accessibility-audit/


PAPERWORK REDUCTION ACT OF 1995 (Pub. L. 104-13) STATEMENT OF PUBLIC BURDEN: Through this information collection, the Administration for Children and Families (ACF) is collecting information to document that title IV-E agencies have planned and developed their system’s conformity to federal CCWIS and Advance Planning Document requirements. Public reporting burden for this collection of information is estimated to average 10 hours per title IV-E agency choosing to develop and implement a CCWIS system, including the time for reviewing instructions, gathering, and maintaining the data needed, and reviewing the collection of information. This is a voluntary collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information subject to the requirements of the Paperwork Reduction Act of 1995, unless it displays a currently valid OMB control number. The OMB # is 0970-0568 and the expiration date is 04/30/2024.


