Deployment and Readiness Systems
Program Management Office

Post-Implementation Review Plan for
Enterprise Blood Management System Increment 1,
Blood Donor Management System

Prepared by:
Force Health Protection and Readiness
Support Milestone: Fielding Decision
Version 1.0
July 2014

TABLE OF CONTENTS
DOCUMENT APPROVAL

1 INTRODUCTION
  1.1 Purpose
  1.2 Background
  1.3 Program Summary
  1.4 PIR Description
  1.5 Resources
  1.6 Schedule

2 AREAS OF ASSESSMENT
  2.1 Customer Satisfaction
  2.2 Mission/Program Impact
  2.3 Return on Investment

3 PLAN OF ACTION
  3.1 Schedule the PIR
  3.2 Assemble a PIR Team
  3.3 Assemble and Review Available Information Sources
  3.4 Conduct the PIR
  3.5 Conduct the Analysis
  3.6 Prepare a Report and Provide Recommendations

4 NEXT PIR REVIEW DATES

APPENDIX A: DRAFT POST IMPLEMENTATION REVIEW REPORT FORMAT
APPENDIX B: DRAFT USER SURVEY QUESTIONS
APPENDIX C: BDMS MEASURES OF EFFECTIVENESS, SUITABILITY, AND SURVIVABILITY
APPENDIX D: BDMS CRITICAL SUCCESS FACTORS
APPENDIX E: ACRONYMS
APPENDIX F: REFERENCES

TABLES
Table 1: Program Events
Table 2: Example User Satisfaction Ratings

DOCUMENT APPROVAL

Enterprise Blood Management System Increment 1
Post Implementation Review Plan
Version 1.0
Signature Page
*****************************************************************
Submitted By:

Yvette Rogers
Acting Director, Deployment Technologies Branch
Readiness Division
Healthcare Operations Directorate
(Digitally signed by ROGERS.YVETTE.E.1061738351, 2014.08.07)
======================================================
Concurred By:

Charles Updegrove
Program Manager
Deployment and Readiness Systems
(Digitally signed by UPDEGROVE.CHARLES.DELANCEY.1025436764, 2014.08.07)
======================================================
Approval:

David J. Smith, M.D.
Deputy Assistant Secretary of Defense
Force Health Protection and Readiness
(Digitally signed by SMITH.DAVID.J.1085480975, 2014.08.15)

ENTERPRISE BLOOD MANAGEMENT SYSTEM INCREMENT 1,
BLOOD DONOR MANAGEMENT SYSTEM
POST IMPLEMENTATION REVIEW PLAN
1 INTRODUCTION

The Enterprise Blood Management System (EBMS) is an Acquisition Category III program
consisting of two Increments: EBMS Increment 1, Blood Donor Management System (BDMS)
and EBMS Increment 2, Blood Management Blood Bank Transfusion Service (BMBB/TS). The
program management responsibility of EBMS falls under the Deployment and Readiness
Systems (D&RS) Program Management Office (PMO) under the executive management of the
Program Executive Officer (PEO) [1], Defense Health Clinical Systems (DHCS). The Defense
Health Agency (DHA) Component Acquisition Executive (CAE) serves as the Milestone
Decision Authority (MDA).
Both EBMS Increments are needed to replace the legacy Defense Blood Standard System
(DBSS). EBMS Increment 1, BDMS, will replace the donor portion of DBSS by providing the ability
to manage donor registration, donor deferral, blood product test results, inventory shipment, enterprise
reporting, and donor application user accreditation/training. The scope includes implementation
within the Blood Donor Centers
(BDCs) – Continental United States (CONUS) and Outside CONUS (OCONUS).

1.1 Purpose
The post implementation review (PIR) will report the degree to which doctrine, organization,
training, materiel, leadership and education, personnel, facilities, and policy changes have
achieved the established measures of effectiveness for the desired capability; evaluate systems to
ensure positive return on investment and decide whether continuation, modification, or
termination of the systems is necessary to meet mission requirements; and document lessons
learned from the PIR.
The purpose of this PIR plan is to support the EBMS Increment 1, BDMS Fielding Decision in
accordance with the Interim DoD Instruction 5000.02, Enclosure 11, and the Defense
Acquisition Guidebook (DAG). This document will provide the necessary framework, but the
actual evaluation is the responsibility of the designated functional sponsor. The outcome of the
PIR will be a detailed report that will:

• Verify the established Measures of Effectiveness (MOEs) from the BDMS Test Plan

• Delineate the differences between estimated and actual investment costs and the benefits and
  possible ramifications for unplanned funding needs in the future

• List the investment, selection and control processes "lessons learned" that can be used as the
  basis for management improvements

• Determine whether the delivered product meets the business need and how shortfalls may be
  mitigated

[1] Defense Health Information Management Systems PMO no longer exists following reorganization
approved by Assistant Secretary of Defense Health Affairs TRICARE Management Activity
Memorandum, Subject: "Reorganization of the Joint Medical Information Systems Program Executive
Office", June 5, 2013.

If there are subsequent releases, each release will require an update to the PIR.

1.2 Background
According to the DAG, the Government Performance and Results Act (GPRA) [2] requires Federal
Agencies to compare actual program results with established performance objectives. In addition, the
Clinger-Cohen Act (Title 40/CCA) [3] requires that Federal Agencies ensure that outcome-based
performance measurements are prescribed for the Information Technology (IT) to be acquired and that
these performance measurements measure how well the IT supports the programs of the Agency [4].
This information requirement is referred to in the Interim DoD Instruction 5000.02 [5] as a PIR. DoD
Components will plan, conduct, and document the required review for IT systems after the Fielding
Decision. Specifically, the plan for conducting the PIR is due at the Fielding Decision.
The Office of Management and Budget (OMB) Circular A-130 [6] has prescribed specific PIR
performance measurements of how well the acquired IT supports Federal Agency programs, and the
DAG provides details of the expected information (to comply with statute) for this PIR. The
procedures for conducting a PIR are listed below:


• Conduct post-implementation reviews of information systems and information resource
  management processes to validate estimated benefits and costs, and document effective
  management practices for broader use.

• Evaluate systems to ensure positive return on investment and decide whether continuation,
  modification, or termination of the systems is necessary to meet business/agency mission
  requirements.

• Document lessons learned from the post-implementation reviews. Redesign oversight
  mechanisms and performance levels to incorporate acquired knowledge.

• Re-assess an investment's business case, technical compliance, and compliance against the
  Economic Analysis (EA).

• Update the EA and Automated Information Technology (AIT) capital planning processes as
  needed.

The PIR will assess actual system performance against program expectations. The DAG provides the
guidance necessary for this review after Initial Operational Capability (IOC) and after Full
Deployment. The DAG further states that the review must verify the fielded system meets or exceeds
thresholds and objectives for cost, performance, and support parameters approved at full-rate
production. The PIR will be conducted following completion of the Fielding Decision and fielding of
the system to the end users [7].

[2] Government Performance and Results Act (GPRA) Modernization Act of 2010.
[3] Section 11313 of Subtitle III of Title 40 of the United States Code (formerly known as Division E
of the Clinger-Cohen Act (CCA), hereinafter referred to as "Title 40/CCA").
[4] Defense Acquisition Guidebook (DAG), Chapter 7.9, Post Implementation Review.
[5] Interim DoD Instruction 5000.02, Table 2, Milestone and Phase Information Requirements.
[6] OMB Circular A-130, Chapter 8, "Policy", section b.(1)(d).
[7] Government Performance and Results Act (GPRA) Modernization Act of 2010.
This PIR will be conducted by the functional sponsor’s designee, with support and cooperation
from the D&RS PMO as necessary. Working in conjunction with the stakeholders, the
functional sponsor shall select the parameters for evaluations based on their relevance to future
modifications or upgrades for performance, sustainability, and affordability improvements, or
when there is a high level of risk that a Critical Success Factor (CSF) will not be sustained over
the life of the system. The proposed format for the final evaluation is located in Appendix A of
this document. This format may be tailored to fit the findings of the PIR Team.
An appropriately conducted PIR will satisfy both GPRA and Title 40/CCA requirements for a
post deployment evaluation.

1.3 Program Summary
The aging Military Health System (MHS) legacy blood management system, the DBSS, has been
unable to accommodate new features and certify software for deployment in a timely manner to meet
user and regulatory demands. DBSS capabilities are not to standard, incur high maintenance costs due
to Food and Drug Administration (FDA) certification, and need to be implemented enterprise-wide.
DBSS received a Denial of Authority To Operate (DATO) on
June 24, 2010 and is currently disconnected from all MHS/Service networks. The health of
MHS beneficiaries donating and receiving blood, blood components, and derivatives is at risk
due to the inability to adequately manage the aforementioned items throughout the continuum of
care.
The solution – the EBMS – is a strategic technology modernization project that will enhance the
DoD Blood Program capabilities for Blood Donor Management through the seamless integration
of blood products inventory management, transport, and availability. Per Acquisition Decision
Memorandum (ADM) dated July 9, 2013, EBMS is an Acquisition Category III (ACAT III)
program consisting of two increments: the BDMS (Increment 1) and the BMBB/TS (Increment
2). Both increments are needed to replace DBSS.
BDMS, an enterprise-wide automated information system (AIS) for the 23 DoD blood donor
facilities, is comprised of a group of Commercial Off the Shelf (COTS) products: LifeTrak®
and InSight®. LifeTrak®, developed by Mediware® Information Systems, Inc. is the core of the
BDMS solution. InSight® is utilized for enterprise performance monitoring and metrics and
KnowledgeTrak™ is the learning management system. BDMS will be hosted at the MHS
Enterprise Service Operations Center (MESOC) in San Antonio, Texas; and the BDMS system
fail-over/ Continuity of Operations (COOP) site is the MESOC in Aurora, CO. The central
LifeTrak® application is a web-based application that will be accessed via laptops or desktops at
local facilities. BDMS manages:


• Collection processes and donor records

• Testing and manufacturing of products

• Distribution for managing inventory

• Enterprise Donor service metrics (enterprise reporting), down to a specific product

BDMS is a stand-alone system with no dependencies on other systems. Donor and deferral
information will be accessible enterprise-wide. BDMS supports mobile blood drives using laptops
capable of collecting data during the drives and synchronizing upon reconnection with the network.
An automated donor-screening tool will screen donors, both military and civilian.
Blood label printing capability is supported in addition to the ability to create detailed inventory
and management reports. BDMS supports the ability to securely import and export shipping and
receiving data. BDMS file and table build out – the COTS Product MHS “tailoring” process –
has been completed by the vendor/Joint Configuration Working Group (using the Planned
Systems International, Incorporated (PSI) prototype environment) and endorsed by the D&RS
PMO/Services.
The LifeTrak® Central Server and web-based InSight® application both utilize the DHSS
Identity Authentication Service (iAS) to authenticate users using Single Sign On with a DoD
Common Access Card (CAC).
The Enterprise Integration Engine provides the ability to ingest data from external systems (i.e.,
laboratory instrument data) and transfer it to the LifeTrak® database.
BDMS implementation spans MHS facilities located CONUS and OCONUS. BDMS will
conduct an Operational Assessment (OA) at three Service sites prior to seeking a Fielding Decision in
Q1FY16.
BDMS meets regulatory compliance at the time of implementation and can adapt to regulatory
changes within the regulatory compliance period. BDMS is subject to the exacting regulations
of the FDA and the standards of use by AABB (formerly the American Association of Blood
Banks). The FDA regulates the manufacturing, marketing, and use of blood establishment
computer systems (BECS). To obtain pre-market "510(k) clearance" from the FDA, BECS products
must meet minimum levels of functionality, attain standardization, ensure a safe blood
product, and comply with federal law. The FDA has issued blood establishment licenses to the
Service Blood Programs of the Air Force (License – 610); Army (License – 611); and Navy
(License – 635). All military blood facilities are registered by the FDA and accredited by
AABB; they must operate according to Title 21, Code of Federal Regulations, Part 200 Series,
Drug Current Good Manufacturing Practices, Part 600 series, Biologics, and Part 800 series,
Medical Devices.

1.4 PIR Description
The PIR should be carried out according to the PIR plan that will be reviewed and approved at
the Fielding Decision. Care should be given to ensuring that accurate raw data is captured so it can
later be used for analysis. In accordance with the PIR plan, the PIR will address:


• Business/Customer Satisfaction: Address whether the user is satisfied with the IT investment and
  determine if the investment meets their needs.

• Mission/Program Impact: Address if the implemented system achieves its intended impact. A
  comparison is conducted of the expectations contained in the original Business Case versus any
  subsequent release.

  - Solicit Feedback: The most important measures of the success of a project are whether the
    product was developed and delivered successfully and how well the needs of the customers
    have been met. The most effective way to determine these measures is to solicit feedback.

  - Conduct Project Assessment: The goal of this task is for the functional proponent
    representative to meet with select members of the project team and stakeholder community to
    present the summarized results of the feedback surveys, discuss all other aspects of the
    completed project, gain consensus on what was successful and what was not, and derive best
    practices and lessons learned.

  - Prepare Post Implementation Report: After the assessment, the Project Manager prepares a
    Post Implementation Report. In the report, the Project Manager distills information gleaned
    from the discussion and organizes it according to the feedback categories described, adding
    information on key project metrics. The report documents the effectiveness of the product in
    meeting the needs of the Customer.

• Return On Investment Calculations: Compare actual project costs, benefits, risks, and return
  information against earlier projections. Determine the causes of any differences between planned
  and actual results.

1.5 Resources
The BDMS PIR will be conducted by the personnel already assigned to the initiative.
Additionally, the PIR will not incur any additional costs for travel or facilities, should these be required. There
will be no compensation associated with the survey or respondents. The survey tool will be
supplied by the Program Office for the analysis of feedback results.

1.6 Schedule
Table 1: Program Events

  Event                                          End Date
  BECS Validation/Early Assessment (EA)          Q2FY15
  Acquisition Decision Memorandum/Milestone C    Q1FY15
  Operational Assessment (OA)                    Q4FY15
  Fielding Decision                              Q1FY16
  Full Deployment (FD)                           Q4FY16

2 AREAS OF ASSESSMENT

In essence, the PIR report is a summation of the successes and challenges of the BDMS program.
The assessment of success supports future decision-making, while the assessment of challenges
can be used to prevent recurrence of problems. Future deployments of BDMS can benefit from
this knowledge, with the potential to save time, decrease cost, improve system performance, and
improve organizational processes. The PIR team will utilize an online, government-procured
Survey Monkey account, which will result in a decreased burden on the respondent and the
collectors as compared to a paper-based survey. Survey responses will be stored in the D&RS
PMO-managed secured Survey Monkey account, and data will be taken offline and stored on a
secure DHA network. There will be no hard copies of raw survey data. Refer to Appendix B for
the proposed BDMS user satisfaction survey questions.

2.1 Customer Satisfaction

Customer satisfaction information will be acquired from BDMS end users located in various
settings. All survey respondents will receive the survey electronically. The results will be
captured, consolidated and analyzed by the D&RS PMO. D&RS PMO will develop a report of
findings to be included in the PIR report and distribute it to the product stakeholders for review.
The responses received from BDMS end users will be anonymous and used to aid future
program-level decision making.
The survey is designed to gather demographic and role-specific data on the users of BDMS and
their satisfaction with the system itself. The survey collects data on six satisfaction variables
which will be used to categorize the results received from the end users; these variables include:


• System Speed
• System Reliability
• System Availability
• BDMS Training
• Overall Ease of Use
• Overall Rating of BDMS

Each of the satisfaction variables listed above will provide the Program Office with data to
conduct a comprehensive review. The following is an example of a User Satisfaction Ratings
Table that will be used to develop metrics associated with the variables listed above:
Table 2: Example User Satisfaction Ratings*

  Review Group           Far Below      Below          Meets          Above          Far Above
                         Expectations   Expectations   Expectations   Expectations   Expectations
  BDMS                   15             16             67             8              4
  Documentation Groups   12             17             65             7              3

  *Numbers shown in the table above denote responses that agree with that category heading.

Based on user ratings and feedback, the BDMS team will tailor further development toward
alleviating pain points and addressing other possible enhancement areas.
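
The following minimal sketch illustrates how the example counts in Table 2 translate into category
percentages for program-level reporting. It is provided for illustration only: the variable names and
structure are assumptions, and the plan itself specifies only that a Program Office-supplied survey
tool will be used for the analysis of feedback results.

    # Convert example response counts (from Table 2) into percentage shares per category.
    # The counts are the illustrative values from Table 2, not actual survey results.
    categories = ["Far Below", "Below", "Meets", "Above", "Far Above"]
    example_counts = {
        "BDMS": [15, 16, 67, 8, 4],
        "Documentation Groups": [12, 17, 65, 7, 3],
    }

    for group, counts in example_counts.items():
        total = sum(counts)
        shares = {c: round(100 * n / total, 1) for c, n in zip(categories, counts)}
        print(group, total, shares)
        # e.g. BDMS: 110 responses, of which 67 (about 60.9%) are "Meets Expectations"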

2.2 Mission/Program Impact
The Top-Level Evaluation Framework matrix, in Appendix C, shows the correlation between
decisions, the primary capabilities, test methodologies, and other key test measures. The primary
test event is the operational test and documentation reviews used to support the BDMS Fielding
Decision.

The test and evaluation community uses MOEs and Measures of Suitability (MOSs) to provide
feedback to the functional community and to the EBMS Project Office on the completeness and
coverage of the requirements necessary to support the T&E of the system under test. MOEs
measure the mission accomplishment that comes from the use of the system under test and all
interrelated systems. Similarly, MOSs measure an item’s ability to be supported in its intended
operational environment. MOSs typically relate to readiness or operational availability,
reliability, maintainability, and the support structure.
Note: When the BDMS PIR Plan is executed, OA results and associated user feedback will
be available to PIR members for reference purposes.

2.3 Return on Investment
The costs and benefits of BDMS will be examined against the economic analysis present in the
BDMS business case. Operational benefits, which reflect non-financial improvements to
mission and administrative processes, will also be examined. Variance from the estimates in
actual program costs and benefits data may lead to a reassessment of the BDMS economic
analysis.
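
For reference, the comparisons described above are commonly expressed with the following
conventional formulas; the plan does not prescribe a specific formulation, so these are shown only to
make the comparison concrete:

    ROI = (Actual Benefits - Actual Costs) / Actual Costs
    Cost Variance = Actual Costs - Estimated Costs
    Benefit Variance = Actual Benefits - Estimated Benefits

Material variances relative to the estimates in the BDMS economic analysis are what may lead to the
reassessment noted above.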

3 PLAN OF ACTION

3.1 Schedule the PIR
The PIR should take place once the operating environment has been established and stabilized.
The typical timeframe is 6 to 12 months after BDMS Full Deployment. BDMS Full Deployment
is scheduled for Q4FY16. The PMO and deployment team will field, train, and sustain the
software at all designated locations. The PIR schedule should be reviewed to determine planned
versus actual completion dates.
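Under the current schedule (Full Deployment in Q4FY16), a 6 to 12 month window would place the
PIR approximately between Q2FY17 and Q4FY17; the actual date will be set when the PIR is
scheduled.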

3.2 Assemble a PIR Team
PIR Teams should be composed of individuals not directly involved in the acquisition. This PIR
Team will be established as a Working-level Integrated Product Team (WIPT) within the
guidelines of the BDMS Integrated Product Team (IPT) with the same voting membership.
The Team should include the following representatives:


• Functional experts with detailed knowledge of the capability or business area and its processes
• User representatives, including Combatant Command users
• Services
• Chief Information Officer (CIO) representative
• Functional Sponsors
• Domain Owners
• Joint Staff
• Test and Evaluation members
• Program Offices
• Infrastructure

3.3 Assemble and Review Available Information Sources
Sources to consider are:


• Economic calculations to establish the payback period and Return on Investment (ROI) of
  business systems
• Qualitative assessments related to expected benefits
• Information Assurance assessments
• Annual Chief Financial Officer (CFO) Reporting of IT investment measured performance
• Stakeholder satisfaction surveys
• Operational Test Event reports

Factors to be considered include:

Customer/User Satisfaction:
• Partnership/involvement
• Business process support
• Investment performance
• Usage

Strategic Impact and Effectiveness:
• System impact and effectiveness
• Alignment with mission goals
• Portfolio analysis and management
• Cost savings

Internal Business:
• Project performance
• Infrastructure availability
• Standards and compliance
• Maintenance
• Evaluations (accuracy, timeliness, program quality, information adequacy)
• Employee satisfaction/retention

Innovation:
• Workforce competency
• Advanced technology use
• Methodology expertise

To ensure that each asset is evaluated consistently, the functional sponsor should have a
documented methodology for conducting these reviews. The methodology chosen must be in
alignment with the program offices. The program office should determine whether there may be
better cost, benefit, and risk measures that could be established that would improve the
monitoring of future projects. In addition, a mechanism should also be in place that takes the
lessons learned through the PIR and uses the lessons to update the Planning and Budgeting Phase
decision criteria as well as the Acquisition Process.

3.4 Conduct the PIR
A project is considered complete when it has been successfully implemented and transitioned to
the performing organization and approved by the Project Sponsor. At this point in the project
management lifecycle, the responsibilities of the Project Manager are to assess how closely the
project met Customer needs, highlight what worked well, learn from mistakes made during the
project, identify patterns and trends, derive ways to improve upon processes executed throughout
the project, and, most importantly, communicate results. The purpose of the PIR is to gather the
information required to meet those responsibilities, and to present the information in a PIR
report.

3.5 Conduct the Analysis
The analysis portion of the PIR should answer the question, “Did we get what we needed?”
This provides a contrast to the test and evaluation measurements of MOEs which answer the
question, “Did we get what we asked for?” This would imply that the PIR should assess the extent
to which the DoD's investment decision-making processes were able to capture the user’s initial
intent. The PIR should also address whether the user’s needs changed during the time the system
was being acquired. The outputs of the analysis become the PIR findings. The findings should
clearly identify the extent to which the user received what they needed.

3.6 Prepare a Report and Provide Recommendations
Once PIR results have been consolidated, the PIR Team will prepare a report and make
recommendations that can be leveraged to mature the capabilities and business needs processes.
The primary recipient of the PIR report should be the Sponsor/Domain Owner who is responsible
for articulating the original objectives and outcome-based performance measures on which the
program or investment was based. The results of the PIR can aid in refining requirements for
subsequent increments. Recommendations may be made to correct errors, improve user
satisfaction, or improve system performance to better match user/business needs. The PIR Team
will also determine whether different or more appropriate outcome-based performance measures
can be developed to enhance the assessment of future spirals or similar IT investment projects.
This review will look at the strategic impact and effectiveness of the system and address whether
the system is in alignment with the mission and goals as outlined in the requirements
documentation. The high-level functional requirements include a list of the CSFs along with a
Requirements Traceability Matrix. Refer to Appendix D for a list of BDMS CSFs. A thorough
review of these areas will help determine the impact of the deployed system. The Team will
evaluate these requirements to see how successful the program has been at meeting the
thresholds and objectives. The Team will also develop a requirements review process and use it
to:


• Demonstrate achievements against the projected costs, benefits, and timeliness
• Isolate areas that do not meet required standards of performance and provide recommendations
  for corrective actions, based on CSFs and other customer feedback
• Identify opportunities to enhance the system
• Identify program strengths and weaknesses for future reference and corrective action
• Provide lessons learned to help in developing future systems/programs

Factors to be evaluated might include qualitative benefits, quantitative benefits, system
performance, and schedule benefits such as:


• Improved facility management of personnel/workload
• Enhanced health and fitness of the force
• Improved inventory management
• Reduction in duplicative efforts

At a minimum, system performance will be evaluated against the Joint and Service Concepts of
Employment/Operations relative to acceptable thresholds for data synchronization to determine
how effectively BDMS supports the needs of users. The final PIR report will be produced once
all appropriate data have been collected and analyzed. As the domain owner, the Deputy
Assistant Secretary of Defense for Health Affairs, Force Health Protection and Readiness, will
receive the final PIR report from the PIR team. A copy will also be provided to the DHCS PEO,
as well as the lead Operational Test Agency and the DHA Defense Health Cost Analysis and
Program Evaluation.

4 NEXT PIR REVIEW DATES

Supplementary PIRs may be required if there are subsequent BDMS releases. (Presently no
additional releases are scheduled). Each new release will require an update to the PIR.


APPENDIX A: DRAFT POST IMPLEMENTATION REVIEW REPORT FORMAT

1. Executive Summary
   The executive summary should reference the major findings and recommendations of the review.

2. Background
   Provide a brief description of BDMS and the circumstances leading to implementation.

3. Methodology
   Describe the approach used to conduct the review: interviews, team members, duration of the
   review, survey instruments, etc.

4. Review Findings
   Each item identified in the methodology section should be included in the review of findings.
   The following areas should be investigated individually and as a group:

   Program Management:
   • Discuss the project management approach used. Identify positive and negative aspects of that
     approach. Determine ways to enhance or change the approach for future use on this program
     and other Military Health System IT programs.
   • Compare the functionality to be delivered to what was actually delivered. Assess user
     perceptions of the value/worth of the functionality implemented. All exceptions and/or
     differences should be highlighted and the impact of the omitted/added functionality explained.
   • Compare the actual timetable for BDMS against the approved timeline. Reasons for any
     differences should be explained. Evaluate the effect of any changes to the planned
     development/implementation.
   • Compare the benefits accrued to date with the benefits expected to be accrued as stated in
     applicable acquisition documentation. A statement is required on the expected achievement of
     any outstanding benefits. Reasons for any differences should be explained. At a minimum,
     CSF and MOE benefits should be measured.
   • The Event Design Plan (EDP) for the Operational Assessment of BDMS will serve as the
     source document for MOEs.
   • Describe the implementation and training component, noting strategies, difficulties,
     deficiencies, and eventual success or failure.
   • Address program audit issues. Describe existing controls and security measures and assess
     their adequacy.

   Benefits:
   • Determine the impact of the deployed system.
   • Review CSFs and compare them to the fielded system.

   Cost of Maintenance and Development:
   • Compare the actual project costs against the estimated costs in the Business Case. Reasons for
     any differences should be explained.

   Cybersecurity:
   • D&RS PMO/DHA Infrastructure and Operations are responsible for monitoring the security
     for the BDMS program. During the PIR, the D&RS PMO security point of contact will be
     responsible for providing the Team with an evaluation of all applicable Cybersecurity artifacts.

   System Interfaces:
   • Software metrics – The goal is to track, analyze, forecast, and thereby improve the present and
     future software development process and its associated standards, taking into account that
     BDMS is a COTS product and a Medical Device, thus limiting software development to
     change requests submitted to the medical device manufacturer. Measure performance against
     requirements at all levels of BDMS infrastructure.
   • Availability – Measure the mean time between failures, downtime, and maintenance.
   • Software Integration Lessons Learned – The goal is to capture modifications to the integration
     effort and process for present and future software integration and its associated standards.
     Collect and record major lessons learned throughout the deployment process and disseminate
     appropriately.

   User Satisfaction:
   • Discuss survey techniques and instruments used to determine user satisfaction.
   • Explain the results of user service surveys. Identify deficiencies and develop a course of action
     to support recommendations.
   • Determine usage rates.
   • Evaluate training and help desk support.

5. Identify Lessons Learned
   Lessons learned should include, but are not limited to:
   • The project management process
   • The systems development process
   • The contracting methodology used
   • The training received/provided
   • The technology that was used
   • The software that was used

6. Recommendations
   Document the recommendations resulting from the PIR and the action plans to implement the
   recommendations. All recommendations must be prioritized, and it is important to evaluate each
   recommended change as to its impact on all areas of BDMS. Costs and benefits related to
   implementing the recommendations should be included. The completed report will be
   coordinated among the stakeholders prior to submission to the domain owner.


APPENDIX B: DRAFT USER SURVEY QUESTIONS
Privacy Advisory
The information collected from you in this survey is completely voluntary and will be used to
evaluate Enterprise Blood Management System Increment 1, Blood Donor Management System
(BDMS) end user satisfaction and system usability. Future BDMS deployments can
benefit from this knowledge, with the potential to save time, decrease cost, and improve system
performance. Neither the Department of Defense (DoD) nor Deployment and Readiness Systems
Program Management Office, under the executive management of the Program Executive Office
Defense Health Clinical Systems, will collect personal information that can be used to identify you
when you visit this Web site. If, for some reason, you supply us with personal information, it will
be treated as confidential. No Internet Protocol addresses, cookies, browser data, operating system
information, or the number of bytes sent and received by your computer will be collected or
stored. Therefore, our organization will not be able to link any survey response data to your
computer. Survey responses will reside in a data collection database. The results may be shared with
DoD Components for use in validating and improving end user satisfaction and system usability.
None of this information will be revealed publicly or used to identify you.
1. Are you a Contractor?
• Yes – [survey will end at this point]
• No
2. How frequently do you use BDMS?
• I have never used BDMS – [survey will end at this point]
• I no longer use BDMS (former user who has stopped using the system)
• Infrequent user (does not use the system every day)
• Frequent user (usually uses the system a few times every day)
• Very frequent user (consistently uses the system throughout the day)
3. How long have you been using BDMS?
• Less than 3 months
• 3 to 6 months
• 6 to 12 months
• 1 or more years


4. What is the name of the facility where you work?
• Camp Lejeune, NC
• Fort Benning, GA
• Fort Bliss, TX
• Fort Bragg, NC
• Fort Gordon, GA
• Fort Hood, TX
• Fort Leonard Wood, MO
• Fort Sam Houston, TX
• Great Lakes, IL
• Joint Base Lewis-McChord
• Keesler AFB, MS
• Lackland AFB, TX
• Landstuhl, Germany
• McGuire AFB, NJ ASWBPL - East
• Naval Hospital Guam, Guam
• Okinawa, Japan
• Pentagon, VA
• Portsmouth, VA
• San Diego NMC, CA
• Travis AFB, CA ASWBPL – West
• Tripler, Hawaii
• WRNMMC, MD
• Wright Patterson AFB, OH
5. What branch of service do you belong to or support?
• Air Force
• Army
• Marine Corps
• Navy
• Other:
6. Which of the following describes your PRIMARY functional area at this facility?
• Blood Donor Center Operations
• Laboratory (Unit Testing)
• Distribution (Shipping/Receiving)
• Other Role:
7. How satisfied are you with: (Scale: Very Satisfied/ Satisfied/ Neither Satisfied nor
Dissatisfied/ Dissatisfied/ Very Dissatisfied)
• System speed
• System reliability
• System availability
• Application connectivity
• Helpdesk process
• BDMS training
• BDMS training materials
• Overall ease of using the system
• Overall rating of BDMS

8. Has your organization changed its workflow or business processes to make it easier
for you to use BDMS?
• Yes
• No
• Comments (if yes, please explain):
9. How satisfied are you with the following BDMS functions: (Scale: Very Satisfied/
Satisfied/ Neither Satisfied nor Dissatisfied/ Dissatisfied/ Very Dissatisfied)
• Donor Registration
• Recording Donor Health History Responses/Physical Findings
• Managing Donors – Donor Merge, Donor Interdictions
• Shipping products
• Recording donor comments
• Inventory Management
• Testing
• Manufacturing/Modifying Products
• Product Labeling
• Product QC Functions
• Comments:
10. Comments:


APPENDIX C: BDMS MEASURES OF EFFECTIVENESS, SUITABILITY, AND SURVIVABILITY
The Army Test and Evaluation Command (ATEC) will conduct an operational assessment (OA).
The OA is a field test of a system or item to examine its operational effectiveness, suitability, and
survivability. OA is conducted under realistic operational conditions with users who represent
those expected to operate and maintain the system when it is fielded or deployed. An OA is
conducted using production or production representative units.
The system is assessed for overall system effectiveness, suitability, and survivability utilizing a
framework of ten critical operational issues (COI). A COI is a key operational effectiveness,
suitability, or survivability issue that must be evaluated to determine the system's capability to
perform its mission.
A COI is normally phrased as a question that must be answered in order to properly evaluate
operational effectiveness, suitability, and survivability.
COI 1. Business Process Support
Criterion 1. Does BDMS support the business process in a timely and accurate manner?

  MOE 1-1. Percent of Essential Business Functions (EBF) successfully completed to support the
  User's Business Process
    Threshold: 99.5% for EBFs linked to High Level Business Outcomes; 85% for all other EBFs
    Test Methodologies/Key Resources: Primary: OA Scenario Execution. Secondary: EA and BECS
    Scenario Execution; Functional SIT

  MOE 1-2. Percent of users indicating they were able to successfully complete their Business Process
  Support EBFs
    Threshold: 70% or greater of surveyed users indicate through the UOS that BDMS meets the MOE.
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA and BECS
    User Opinion Surveys

COI 2. Interoperability
Criterion 2. Does BDMS support the Net-Ready CSF requirements?

  MOE 2-1. Interoperability Assessment of Net-Ready CSF
    Threshold: Must operate on each Service's infrastructure and must fully demonstrate that the
    critical system data exchanges can be accomplished to support military operations in net-centric
    operations. The system can be installed, configured, and managed on each Service's platforms and
    communications infrastructure to support its net-centric military operations. EBMS components
    must demonstrate the end-to-end information exchange requirements with its critical and external
    systems/applications/interfaces, as defined in the Business Case.
    Test Methodologies/Key Resources: Primary: OA JITC over-the-shoulder observations and end
    user surveys/interviews. Secondary: SIT and EA JITC over-the-shoulder observations and database
    migration verification

COI 3. Database Management
Criterion 3. Is data available in a timely, complete, and accurate manner?

  MOE 3-1. Database Migration
    Threshold: Pass/Fail
    Test Methodologies/Key Resources: Database Migration SIT

  MOE 3-2. High Level Outcome Data Completeness
    Threshold: 99.5%
    Test Methodologies/Key Resources: Primary: OA scenario execution database queries and
    verification; SIT database migration queries and verification. Secondary: EA scenario execution
    database query/verification

  MOE 3-3. High Level Outcome Data Accuracy
    Threshold: 99.9%
    Test Methodologies/Key Resources: Primary: OA scenario execution database queries and
    verification; SIT database migration queries and verification. Secondary: EA scenario execution
    database query/verification

  MOE 3-4. Accessibility Query CSFs, database timeliness and load
    Threshold: 15 seconds for up to 100 requests per second for both system and network
    Test Methodologies/Key Resources: Primary: Capacity Analysis; OA instrumentation. Secondary:
    EA instrumentation

COI 4. Network System Management
Criterion 4. Mission accomplished by managing and utilizing the intended network infrastructure.

  MOE 4-1. Accessibility Query CSFs, network timeliness and load
    Threshold: 15 seconds for up to 100 requests per second for both system and network
    Test Methodologies/Key Resources: Primary: Capacity Analysis; OA instrumentation. Secondary:
    EA instrumentation

COI 5. Training
Criterion 5. Does BDMS training prepare users to operate the system as expected?

  MOS 5-1. Percent of users indicating through the UOS that the training prepared them to operate the
  system in a timely and accurate manner.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA User
    Opinion Surveys

  MOS 5-2. Percent of users indicating through the UOS that training documentation is adequate to
  support task completion and deployment.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA User
    Opinion Surveys

  MOS 5-3. Percent of users indicating through the UOS that formal and informal change management
  efforts facilitated an efficient transition from the legacy system to the BDMS system.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA User
    Opinion Surveys

  MOS 5-4. Formal training (based on the new SOPs) must be developed and provided to each site.
    Threshold: Pass/Fail
    Test Methodologies/Key Resources: AMEDDC&S Training Readiness Statement

  MOE 5-5. SOPs are FDA and AABB compliant.
    Threshold: Pass/Fail
    Test Methodologies/Key Resources: Primary: BECS validation results. Secondary: OA scenario
    execution

COI 6. User Friendliness
Criterion 6. Does BDMS provide features and characteristics that enable users to operate the system in
a timely and accurate manner?

  MOS 6-1. BDMS' data entry, data displays, interactive controls, and error management functions are
  adequate and easy to use to facilitate mission performance in a timely and accurate manner.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS.
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA User
    Opinion Surveys

COI 7. Supportability
Criterion 7. Does BDMS provide the capability to support users in accomplishing their mission by
ensuring a reliable, available, and maintainable system?

  MOS 7-1. There are adequate manpower and personnel to support users in documenting and tracking
  issues so as to facilitate issue resolution in a timely manner.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS;
    staffing levels in accordance with the Life Cycle Sustainment Plan
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys; Life Cycle Sustainment
    Plan SME review. Secondary: EA User Opinion Surveys

  MOS 7-2. BDMS must provide a reliable system to support the Users in accomplishing their mission
  in a timely and accurate manner.
    Threshold: MTBx, where x are the reliability failure categories in the Failure Definition/Scoring
    Criteria (FD/SC)
    Test Methodologies/Key Resources: Primary and Secondary: OA Scenario Execution data and
    failure information (help desk tickets and/or test incident reports)

  MOS 7-3. BDMS is consistently available to support the users in accomplishing their mission in a
  timely and accurate manner.
    Threshold: 99% Operational Availability (Ao). This requirement applies to enterprise instances of
    this system as well as client systems. Ao is calculated from the formula Up Time/Total Time, or
    Up Time/(Up Time + Downtime) = MTBSA/(MTBSA + MTTR + ALDT), where MTBSA is Mean
    Time Before System Abort, MTTR is Mean Time To Repair, and ALDT is Average Logistics
    Delay Time.
    Test Methodologies/Key Resources: Primary and Secondary: OA Scenario Execution data and
    failure information (help desk tickets and/or test incident reports)

  MOS 7-4. Percent of Users indicating that the BDMS online help text, data field names, error
  messages, and icons help Users to enter data into the BDMS Documentation Tool.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA User
    Opinion Surveys

  MOS 7-5. Percent of Users indicating that the BDMS User Manual and/or quick reference guides are
  adequate to assist in resolving questions concerning BDMS usage.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA User
    Opinion Surveys

  MOS 7-6. Percent of Users indicating the help desk provides adequate service to enable issue
  resolution.
    Threshold: 80% or greater of surveyed users indicate through the UOS that BDMS meets the MOS
    Test Methodologies/Key Resources: Primary: OA User Opinion Surveys. Secondary: EA User
    Opinion Surveys

  MOS 7-7. BDMS shall be compliant with a lifecycle sustainment plan.
    Threshold: Successful development of a lifecycle sustainment plan
    Test Methodologies/Key Resources: Life Cycle Sustainment Plan completion

  MOS 7-8. Number, severity, and response times of help desk tickets.
    Threshold: In accordance with the service level agreement in the BDMS Life Cycle Sustainment
    Plan
    Test Methodologies/Key Resources: Primary and Secondary: OA Scenario Execution data and
    failure information (help desk tickets and/or test incident reports)

COI 8. Cyber Security
Criterion 8. Does BDMS comply with the MHS's comprehensive security program (see DODI
8510.01) and have processes and procedures to prevent unauthorized individuals from degrading,
manipulating, or interrupting system performance or data availability?

  MOS 8-1. All required security certifications and accreditations verified in accordance with DODI
  8510.01.
    Threshold: Successful issuance of an Authority to Operate (ATO) or Interim Authority to Operate
    (IATO).
    Test Methodologies/Key Resources: Signed ATO or IATO

  MOS 8-2. The system has controls to prevent unauthorized individuals from degrading, manipulating,
  or interrupting system performance or data availability.
    Threshold: No high risk (Category 1 or 2) vulnerabilities in the Plan of Actions and Milestones
    (POA&M).
    Test Methodologies/Key Resources: Copy of the DIACAP package containing the DIACAP
    Scorecard, Plan of Actions and Milestones (POA&M), signed ATO or IATO, and DHA IA led
    security test and evaluation reports

COI 9. Continuity of Operations
Criterion 9. Are BDMS's COOP features and capabilities, along with user practices and processes,
adequate to sustain the system as required for the mission, including backup, restoration, archiving,
and scheduled shut down for maintenance or movement?

  MOS 9-1. COOP features, capabilities, practices, and processes are adequate to sustain the system.
    Threshold: An adequate Continuity of Operations Plan must exist at both the enterprise and MTF
    levels. Successful demonstration of alternate site functionality.
    Test Methodologies/Key Resources: COOP plans and interviews with appropriate system
    administrators; demonstration of alternate site functionality

APPENDIX D: BDMS CRITICAL SUCCESS FACTORS
Requirement: Net-Ready (NR)

Critical Success Factor: Fully support execution of joint critical operational activities and information
exchanges identified in the DoD Enterprise, as well as be IAW solution architectures based on
integrated Department of Defense Architecture Framework (DoDAF) content, and must satisfy the
technical requirements for transition to Net-Centric operations.

Production Threshold:
1) Solution architecture products compliant with the DoD Enterprise Architecture based on integrated
   DoDAF content, including specified operationally effective information exchanges.
2) Compliant with the Net-Centric Data Strategy and Net-Centric Services Strategy, and the principles
   and rules identified in the DoD Information Enterprise Architecture (DoD IEA).
3) Compliant with GIG Technical Guidance, to include the implementation guidance of GIG
   Enterprise Service Profiles (GESPs) necessary to meet all operational requirements specified in the
   DoD Enterprise Architecture.
4) Information assurance requirements including availability, integrity, authentication, confidentiality,
   and non-repudiation, and issuance of an Interim Authority to Operate (IATO) or Authority To
   Operate (ATO) by the Designated Accrediting Authority (DAA).

Production Objective: [T=O] (Threshold equals Objective).
Requirement: System Operational Availability

Critical Success Factor / Production Threshold: Must provide a 99% Operational Availability (Ao).
This requirement applies to enterprise instances of this system as well as client systems. Ao is
calculated from the formula Up Time/Total Time, or Up Time/(Up Time + Downtime) =
MTBSA/(MTBSA + MTTR + ALDT), where MTBSA is Mean Time Before System Abort, MTTR is
Mean Time To Repair, and ALDT is Average Logistics Delay Time.

Production Objective: Must provide a 100% operational availability.
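
As an illustration of the Ao formula above (the values are hypothetical and chosen only to make the
arithmetic concrete; they are not program estimates): with MTBSA = 990 hours, MTTR = 8 hours, and
ALDT = 2 hours,

    Ao = MTBSA / (MTBSA + MTTR + ALDT) = 990 / (990 + 8 + 2) = 0.99 (99%),

which meets the 99% production threshold; the 100% production objective corresponds to zero
accumulated repair and logistics delay time (MTTR + ALDT = 0).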

Requirement: Accessibility

Critical Success Factor: When BDMS is connected to the network instances, the following
information access capabilities are visible and understandable to authorized users:

1. Donor Management - management of Donor information as it relates to donations, test results, and
   Donor data, to include:
   • Donor consolidated donation history
   • Alerts for unsuitable donors
2. Blood/Blood Product Management - tracking, documentation, and management of blood/blood
   products, to include:
   • Manufacturing data
   • Accurate barcode labeling
   • Blood Component Information
   IAW FDA and AABB regulations.
3. Inventory Management - track data associated with storage, disposition, and shipment of blood
   products, to include:
   • Blood/Blood Product traceability from donation to destruction
   • Donation Identification Number (DIN)
   • Potential blood product needs for the enterprise
4. Look Back - retrieval capability for Donor Services from input to final disposition in compliance
   with regulatory guidelines.

Production Threshold:
   NIPRNET Access: 100 users per hour.
   1. Donor Management Query Response Time: Content - 15 seconds for up to 100 requests per second
   2. Blood/Blood Product Management Query Response Time: Content - 15 seconds for up to 100
      requests per second
   3. Inventory Management Query Response Time: Content - 15 seconds for up to 100 requests per
      second
   4. Look Back Query Response Time: Content - 15 seconds for up to 100 requests per second

Production Objective:
   NIPRNET Access: 350 users per hour.
   1. Donor Management Query Response Time: Content - 5 seconds for up to 350 requests per second
   2. Blood/Blood Product Management Query Response Time: Content - 5 seconds for up to 350
      requests per second
   3. Inventory Management Query Response Time: Content - 5 seconds for up to 350 requests per
      second
   4. Look Back Query Response Time: Content - 5 seconds for up to 350 requests per second


APPENDIX E: ACRONYMS
Ao - Operational Availability
AABB - formerly the American Association of Blood Banks
ACAT - Acquisition Category
AIT - Automated Information Technology
ALDT - Average Logistics Delay Time
AMEDDC&S - Army Medical Department Center and School
APBC - Automated Patient Backup Card
ATO - Authority to Operate
BDMS - Blood Donor Management System
BMBB/TS - Blood Management Blood Bank Transfusion Service
CAC - Common Access Card
CCA - Clinger-Cohen Act
CFO - Chief Financial Officer
CIO - Chief Information Officer
COI - Critical Operational Issue
CONUS - Continental United States
COOP - Continuity of Operations Plan
CSF - Critical Success Factor
DBSS - Defense Blood Standard System
D&RS PMO - Deployment and Readiness Systems Program Management Office
DAA - Designated Accrediting Authority
DIACAP - DoD Information Assurance Certification and Accreditation Process
DIN - Donation Identification Number
DoD - Department of Defense
DoDAF - Department of Defense Architecture Framework
DoD EA - Department of Defense Enterprise Architecture
DoDI - Department of Defense Instruction
EA - Economic Analysis
EBF - Essential Business Functions
EBMS - Enterprise Blood Management System
EHR - Electronic Health Record
FD/SC - Failure Definition/Scoring Criteria
FDA - Food and Drug Administration
FD - Full Deployment
GESP - Global Information Grid Enterprise Service Profiles
GIG - Global Information Grid
GPRA - Government Performance and Results Act
IA - Information Assurance
iAS - identity Authentication Services
IATO - Interim Authority to Operate
IAW - In Accordance With
IOC - Initial Operational Capability
IPT - Integrated Product Team
JITC - Joint Interoperability Test Command
KPP - Key Performance Parameter
MESOC - MHS Enterprise Service Operations Center
MHS - Military Health System
MOE - Measure of Effectiveness
MOP - Measure of Performance
MOS - Measure of Suitability
MTBSA - Mean Time Before System Abort
MTTR - Mean Time To Repair
NIPRNET - Non-secure Internet Protocol Router Network
NR - Net-Ready
OA - Operational Assessment
OCONUS - Outside the Continental United States
OMB - Office of Management and Budget
PEO - Program Executive Officer
PIR - Post-Implementation Review
PMO - Program Management Office
POA&M - Plan of Actions and Milestones
ROI - Return on Investment
SIT - System Integration Testing
SME - Subject Matter Expert
SOP - Standard Operating Procedure
T=O - Threshold equals Objective
T&E - Test and Evaluation
UOS - User Opinion Surveys
WIPT - Working-level Integrated Product Team


APPENDIX F: REFERENCES
Defense Acquisition Guidebook (DAG), https://dag.dau.mil/Pages/Default.aspx
DoD Instruction 5000.02, “Operation of the Defense Acquisition System,” dated
November 25, 2013 (interim guidance)
Office of Management and Budget (OMB) Circular A-130, Chapter 8
Clinger-Cohen Act (Title 40/CCA) Compliance Guidance
Government Performance and Results Act (GPRA) Modernization Act of 2010
Enterprise Blood Management System Acquisition Decision Memorandum, July 9, 2013
Blood Donor Management System Business Case v1.0, July 30, 2014
