CDAO Agile Adoption Feedback Questionnaire

Fast Track Generic Clearance for the Collection of Qualitative Feedback on Agency Service Delivery

OMB: 0704-0553


Thank you for your participation. You were identified by your team lead as a member of a data, analytics, or AI product team and asked to complete this data call in support of the Data, Analytics, and Artificial Intelligence (AI) Adoption Strategy. The purpose of this data call is to identify strengths and weaknesses of current services and to improve service delivery based on your feedback. The questions target areas such as timeliness, appropriateness, accuracy of information, courtesy, and efficiency of service.


On the next page, please enter your Unique Team Identifier, located in the same email as the link to this data call questionnaire.


This data call should take no more than 15 minutes to complete. There are no “right” answers, so please answer the questions as accurately as you can.


Please be aware of the following terminology when responding to the questions:

  • Product and/or Service – a data, analytics, or AI asset or platform, including reports, dashboards, datasets, AI models, algorithms, software applications, and supporting infrastructure

  • Feature – a component of a product and/or service

  • Update – changes/improvements to a product and/or service

  • Agile – a software development approach characterized by regular releases of working software through iterative and incremental development cycles


OMB CONTROL NUMBER: 0704-0553
OMB EXPIRATION DATE: 05/31/2025


AGENCY DISCLOSURE NOTICE

The public reporting burden for this collection of information, 0704-0553, is estimated to average 15 minutes per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding the burden estimate or burden reduction suggestions to the Department of Defense, Washington Headquarters Services, at whs.mc-[email protected]. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number.






* Required


Your Information




  1. Enter your team ID from the email provided by your POC. *

[Open text response]



  2. What is the name of the data & analytics product and/or service team you support?

[Open text response]



  3. What is your role/title on the team? *

[Open text response]

  4. What product type does your team predominantly support? *


Data Dashboard or Visualization

Dataset / Database

AI Solution

Algorithm/Predictive Model

Data Management Platform

AI/ML Platform

Analytics and BI Platform

Infrastructure/Hosting

Software Application/Web Tools

Product Type Not Listed (Specify below)

We support multiple products



  5. Specify the product type your team supports

[Open text response]



  6. How many years have you been in your current role? *


Less than 1 year

1-2 years

2-5 years

5-10 years

More than 10 years

  7. How many years of overall experience do you have? *


Less than 1 year

1-2 years

2-5 years

5-10 years

More than 10 years




  8. How many years have you worked in the DoD? *


Less than 1 year

1-2 years

2-5 years

5-10 years

More than 10 years




  9. Which option best describes your status within the DoD? *


Civilian employee

Military

Contractor

Other (Specify below)




  10. Specify your status within the DoD

[Open text response]

Speed




  11. How often do you incorporate user feedback during product and/or service design / pre-production? *


We do not incorporate user feedback during product/service design / pre-production.

We rarely incorporate user feedback during product/service design / pre-production.

We sometimes incorporate user feedback during product/service design / pre-production.

We often incorporate user feedback during product/service design / pre-production.

We always incorporate user feedback during product/service design / pre-production.




  12. What is the average time to deliver trial versions / prototypes of prospective products and/or services for iteration and/or user feedback? (Note: provide the average time from initiation of the concept to delivery of the trial version / prototype.) *


We do not deliver trial versions / prototypes of products/services

More than 6 months

Between 3 and 6 months

Between 1 and 3 months

Less than 1 month



  13. What is the average time from concept initiation to wide release of a new product and/or service? *


More than 2 years


Between 1 and 2 years


Between 6 months and 1 year


Between 3 and 6 months

Less than 3 months

  14. What is the average time to release an update to an existing product and/or service? *


We do not deliver product/service updates

More than 6 months

Between 3 and 6 months

Between 1 and 3 months

Less than 1 month



  15. How often do you use automation capabilities (e.g., for testing, deployment, etc.) and/or continuous integration and continuous deployment (CI/CD) during product and/or service development? *


Never (all activities are manual from build to release)

Rarely

Sometimes

Often

Always

Agility




  16. Which of the following best describes your team's approach to product and/or service management? *


Our team delivers projects focused on achieving a pre-determined scope in a given timeline


Our team delivers projects focused on achieving a pre-determined scope in a given timeline that are mostly aligned to business/mission objectives


Our team fluctuates between both project and product management; we define product value and plan incremental improvements, but still manage most tasks as discrete projects


Our team delivers products that are improved/enhanced regularly but often with timelines that are greater than 3 months


Our team delivers products that are continuously improved/enhanced with shorter timelines (less than 3 months), focused on achieving incremental value to the mission/business user for each update.




  17. Which of the following best describes your team's development operations (DevOps) ways of working? *


Development and operations are often the same staff and/or we do not have integrated ways of working between development and operations.


Handoffs occur between development and operations, with limited collaboration.

Regular touchpoints occur between development and operations per project.

Operations is considered early in development, and revisited/consulted in cases of major incidents.


Developers and operations act as one team, with dedicated roles in place (e.g., site reliability engineering, DevOps).




  18. To what extent is the solution architecture or underlying infrastructure (including cloud and on-premises) optimized for agile/iterative software delivery? *


The architecture/infrastructure presents major challenges/roadblocks to agility.


The architecture/infrastructure has barriers to agility that can be overcome with effort.


The architecture/infrastructure is fully optimized and presents minimal barriers to agile delivery.

  19. To what extent has your project team adopted agile practices? *


Our team has not adopted agile practices


Our team is starting to explore agile approaches (e.g., shorter timelines for product/update release, iteration based on user needs/changing dynamics, managing a prioritized backlog, etc.) for specific processes


Agile approaches are used, but inconsistently adopted and/or constrained by internal policies/practices


Near-full adoption, with plans to remove remaining barriers to full adoption (operational models, policies/processes, career paths/skills)


Our team is fully agile




  20. Which of these scenarios best describes how decisions are made within your organization? *


All decisions that impact our team's work are centralized and hierarchical.

Most decisions that impact our team's work are centralized and hierarchical.

Some decisions are centralized while others are decentralized at lower levels of the organization.


Most decisions are decentralized and guided by empowered leaders, with minimal (but still present) centralized decision-making.


Empowered leadership provides guidelines and desired outcomes, with decentralized decision-making at the lowest level possible.




  21. To what degree does your team re-assess and re-prioritize new functionalities, features, and/or upgrades based on changing dynamics? *


Priorities are set upfront and not revisited.


Priorities are assessed and re-prioritized every 6 months or longer.

Priorities are assessed and re-prioritized every 3 to 6 months.

Priorities are assessed and re-prioritized every 1 to 3 months.


Priorities are assessed and re-prioritized multiple times per month.

Learning




  22. To what extent are your teams composed of individuals with diverse/varied roles and skillsets? *


Our team is primarily focused on one specific task or area of expertise.


Our team is primarily focused on one specific task or area of expertise, but we will work with other teams as needed to bring in additional skillsets.


Our team has a mix of roles, both specialized and cross-functional (i.e., work across different areas/skills).

Our team is largely cross-functional (i.e., work across different areas/skills) with some specialized roles.

Our team is fully cross-functional (i.e., work across different areas/skills) where everyone has diverse skillsets.




  23. To what extent do you typically document lessons learned from successes and failures of your product/service development and apply them to your future work? *


We do not document lessons learned.


We document lessons learned for some tasks, but do not apply them in our future work.

We document lessons learned at the end of every task or when a product/service/update is delivered, but inconsistently apply them to future work.

We document lessons learned at touchpoints throughout every product/service/update delivery and usually apply them to future work.

We document lessons learned as appropriate throughout every product/service/update delivery and continuously and consistently apply them to future work.




  24. To what extent is user feedback for your products and/or services captured for however long the product/service exists and applied to future processes, operations, and/or maintenance? *


Feedback from users is not captured for any product/service/update for however long the product exists.

Feedback from users is inconsistently captured and not applied to future operations.

Feedback from users is usually captured but rarely applied to future operations.


Feedback from users is often captured but only sometimes applied to future operations.


Feedback from users is always captured for however long the product exists, and continuously applied to future operations.

  25. How do you manage and maintain a library of reusable assets (including code, features, models, templates, etc.)? *


No management of reusable assets


Informal management with limited documentation

Managed library with basic documentation

Managed library with comprehensive documentation and usage guidelines


Managed and maintained library with standardized coding practices and regular updates

Responsibility




  26. Once identified, how quickly can critical defects, issues, bugs, and/or problems be corrected in the product and/or service? *


Inconsistent and/or delayed response to address known issues

Between 7-14 days

Between 3-7 days


Between 1-3 days


Within 1 day




  27. What is the primary type of metric used to track progress of projects? *


Metrics are not formally tracked


Only output metrics (i.e., Measures of Performance) are tracked such as work completed, development speed, test results, etc.


Predominantly output metrics are tracked, with a few outcomes-based, mission/business focused metrics (i.e., Measures of Effectiveness) such as detailed quality requirements, measures based on user and stakeholder feedback, etc.


A mix of both output and outcomes-based metrics are tracked.


Predominantly outcome-based, business/mission focused metrics (i.e., Measures of Effectiveness) are tracked.




  28. To what extent is compliance with applicable data governance processes tracked/monitored? Example data governance policies may be those related to data sharing and reuse, data standards, privacy, data quality, data collection ethics/bias, etc. *


Compliance with applicable data governance policies is never tracked/monitored

Compliance with applicable data governance policies is sometimes tracked/monitored

Compliance with applicable data governance policies is often tracked/monitored

Data governance policies are always tracked, monitored, and addressed but often via manual processes (e.g., email)

Data governance policies are always tracked, monitored, and addressed via automated processes/tooling (e.g., data catalog, workflow tools, etc.) as applicable.

  29. How is the advisability of pursuing a proposed product/service/feature evaluated during the development and delivery lifecycle? Evaluation criteria can include ethical, safety, social/rights-impacting, reputational, and/or environmental impacts/risks. *

Note: "Advisability" refers to the appropriateness, wisdom, or suitability of deploying or using a particular product/service/feature in a given context. It involves assessing whether the implementation of the product/service/feature is a good idea given ethical considerations, risk and safety, and other impacts.


An advisability evaluation is not included in the development process.


Advisability is informally evaluated with stakeholders at the beginning of product/service/feature development.


Advisability evaluation is a part of the formal process to decide if a new product/feature/service will be pursued. However, there are no consistent evaluation criteria to assist with the advisability determination.


Advisability evaluation is a part of the formal process to decide if a new product/service/feature will be pursued. Decisions are consistently documented based on defined evaluation criteria (e.g., ethical, safety, social/rights-impacting, etc.).


Advisability is formally, consistently, and continuously reviewed across the product/service/feature development lifecycle, not just at the beginning. We actively manage/update the criteria used to evaluate advisability.




  30. To what extent has your team leveraged DoD CDAO guidance and resources to support responsible AI development? Example resources may include policy/guidance, CDAO AI T&E Resources, RAI Toolkit, Joint AI Test Infrastructure Capability (JATIC), etc. *


We have not yet used DoD CDAO guidance/resources to support AI solution development/management.


We have tested, experimented, and/or identified potentially useful CDAO AI guidance/resources but have not implemented them.


We have implemented CDAO AI guidance/resources but have not consistently used them.

We consistently use CDAO AI guidance/resources in AI solution development/management.

We consistently use CDAO AI guidance/resources and have provided feedback for improvement.




  31. Please specify which resources were used for responsible AI development

[Open text response]

Final Question




  32. Is there anything else you would like to share about your D&A product team?

[Open text response]




