RIG Comment Reconciliation Chart

RIG FRN Reconciliation for OMB.xlsx

National Science Foundation Research Infrastructure Guide


OMB: 3145-0239

Section | Feedback Description | NSF Reconciliation Statement
General I have provided comments throughout the draft RIG PDF file. That file is greater than 10 Megabytes so I cannot upload it here. I am forwarding that to my NSF program officer for review. Overall, excellent additions and reorganization. Including subsections describing the reasons for PEP sections, relevance, and "tips and tricks" type guidance will help many a subject matter scientist learn how to do project management as NSF needs.  No change. The comment was favorable.
General Overall the document is significantly improved compared to the 2021 version.  Several topics including Program Life Cycle Planning are presented in a much more organized and cohesive manner.   No change. The comment was favorable.
General Overall, RIG updates have increased my understanding of what is required for the components and subcomponents of the project execution plan. The breakdown of the components, their subcomponents, documents, and products is very helpful.

The descriptions, rationale, “how to” guidance, good practices, and practical considerations are immensely helpful. Below are some comments that may be useful in further refining the document. Thank you for your work on this document!
No change. The comment was favorable.
General This seems to place a large burden on the contracting community. Has the NSF received a FAR deviation to apply this guidance? Does this comply with the Procurement Publication Statute? If not, NSF should remove all references to contracts. The 2025 RIG is intended to cover all award instrument types used by NSF. Section 1.3.1 of the RIG states that "NSF has the statutory authority to use a variety of award instruments including financial assistance (grants and cooperative agreements [CA]), contracts, and Other Arrangements/ Transactions (OA/T) to fund scientific programs, RI, and to otherwise execute the agency’s mission." If there is any perceived or actual conflict with the acquisition statute, regulation, or policy, NSF will work with interested parties to tailor application of the RIG to ensure full compliance with these underlying authorities.
General Given the conflict between the FAR and this NSF document, can NSF clarify that the FAR governs when a contract is issued (e.g., EVM requirements)? The RIG is agency guidance that is specific to certain types of Research Infrastructure awards. If there is any perceived or actual conflict with the acquisition statute, regulation, or policy, NSF will work with interested parties to tailor application of the RIG to ensure full compliance with these underlying authorities.
General This document references contracts and appears to be a procurement policy but does not appear to comply with 41 U.S. Code § 1707 - Publication of proposed regulations.

The RIG also appears to be part of the NSF financial assistance framework. How can it apply to contracts in lieu of FAR requirements (e.g., EVM, CO responsibilities)? If this does apply to contracts, will there be an associated deviation from the FAR or additional rulemaking?
The RIG is guidance that is specific to larger Research Infrastructure awards. NSF is not intending to create any new acquisition policy or regulation through the RIG. If there is any perceived or actual conflict with the acquisition statute, regulation, or policy, NSF will work with interested parties to tailor application of the RIG to ensure full compliance with these underlying authorities.
General I love the "key takeaway" callouts. No change. The comment was favorable.
General Not specifically limited to this section, but the “Key Takeaway” text boxes shouldn’t really be needed. These sentences should just be the first sentence of the sections, or clearly stated as a program requirement. No change. The comment will be taken into account for a future RIG revision.
General While "must" and "should" have been well defined, "may be considered" has not. Slight text modifications where appropriate. Language changed to "should" in 2.5.1.1 NSF Oversight and Conceptual Design Review and 1.3.1.2 Contracts, language clarified in 2.9.2 Expectations for Mid-scale RI Proposers and Awardees, and no change to instance in Lexicon.
General Ther Executive Order 14173, signed on 21 July 2025, mandated that acquisition, grant, and assistance procedures remove DEI-related language. In accordance with Executive Order 14173—Ending Illegal Discrimination and Restoring Merit-Based
Opportunity, Section 2, Policy, signed January 21, 2025, minor edits made throughout the document to remove references to diversity, equity, inclusion, and accessibility (DEIA).
1.1 Purpose and Scope Update RI definition to > Research Infrastructure (RI) is any combination of facilities, equipment, resources and services, instrumentation, computational hardware and software that fosters research and innovation in any field, and the necessary human capital in support of the same. The user base of RI must include members of the research community beyond a single lab or institution. Section 1.1 Purpose and Scope updated with new RI definition.
1.1 Purpose and Scope Musts & shoulds - Thanks for clarifying this. Super important to extinguish ambiguity early on. No change. The comment was favorable.
1.1 Purpose and Scope Will Section 1.1 be updated to remove the preference for Cooperative Agreements? It seems inconsistent with the FGCAA. Additionally, will this address case law related to the use of cooperative agreements. Section 1.1 contains no reference to cooperative agreements. Section 1.3.1 states that "NSF has the statutory authority to use a variety of award instruments including financial assistance (grants and cooperative agreements [CA]), contracts, and Other Arrangements/ Transactions (OA/T) to fund scientific programs, RI, and to otherwise execute the agency’s mission. The selection of award instruments is based on the primary purpose of the award, the beneficiary of the award, and other factors." There is no preference for cooperative agreements and this statement aligns with the Federal Grant and Cooperative Agreement Act.
1.1 Purpose and Scope Regarding this statement: “the term project specifically refers to the Construction Stage for Major Facilities”

Recommend changing “project” instead to “Construction Phase Project” since different terms such as “Science Support Program” are used for the Operations phase. There are many instances of the word “project” and not all of them relate solely to the “Construction Phase for Major Facilities.” Using a generic term for something specific will cause confusion, especially since this document covers all facility stages and the project management methods to use for each.
No change. The comment will be taken into account for a future RIG revision.
1.3.1.1 Financial Assistance Awards  (Due to their complex nature...) This paragraph is a bit of a non-sequitur. Perhaps this and the previous mention of the two budget formats belong under a heading such as "special requirements for MF and mid-scale RI financial assistance"? No change. The comment will be taken into account for a future RIG revision.
1.3.1.3 Other Arrangements/ Other Transactions  (For RI OA/T…) OK, third time this two-format budget text is mentioned. I guess you need to be clear that it applies to ALL federal assistance, contracts, and OA/T The two-format budget only applies to financial assistance due to the current NSF award system. Text in Section 1.3.1.3 Other Arrangements/ Other Transactions, page 24, modified to read: "For all NSF awards, including OA/T, the budget must be submitted in an appropriate WBS format (see Section 4.2 Scope and Work Breakdown Structure)."
2.1 NSF Staff Roles and Responsibilities for Management and Oversight  Section 2.1.1: “However, it acknowledges that variations may arise due to the distinct characteristics of each facility. These deviations reflect the unique requirements and challenges associated with the specific goals and operational demands of individual projects and Science Support Programs. The flexibility inherent in the guidance ensures that the framework can adapt to the varied nature of each facility while maintaining alignment with overall scientific objectives.”

It would be useful to be more specific about the “variations”. How are these handled in terms of NSF decision making and approval requirements (what role makes the decisions)?

Where should they be captured? The CA? Program Documentation?
Paragraph changed to read: "The Research Infrastructure Guide (RIG) outlines the typical processes and expectations for each life cycle stage of Major Facilities and Mid-scale Research Infrastructure (RI). However, variations may arise due to the technical requirements and challenges associated with a project or the objectives and operational characteristics of a specific Science Support Program. The RIG provides inherent flexibilities so that proposals and awards can be adapted to the unique nature of each facility while maintaining sufficient agency oversight."
2.1 NSF Staff Roles and Responsibilities for Management and Oversight  The wording here seems redundant and vague:
“Sponsoring Directorate. The NSF Sponsoring Directorate division? Person? that proposes the project and is committed to funding the pre-construction development and design activities, eventual operations as a Science Support Program, and final disposition. The senior management within the Sponsoring Directorate considers community inputs, discipline-specific studies, advisory committee recommendations and internal NSF factors to prioritize candidate projects, balancing risk with opportunities and competing demands for available resources.”

Overall, this section does communicate NSF’s structure and oversight framework but could be improved by editing content for brevity and clarity.
Changed to read: "The NSF organization that proposes the project to NSF Leadership…"
2.1 NSF Staff Roles and Responsibilities for Management and Oversight  This section and figure (2.1.1-1) are really excellent.  This oversight was always unclear previously. No change. The comment was favorable.
2.1.2 Coordinating and Advisory Bodies Agree with the change of moving this topic from section 6 to section 2.

Recommend keeping the roles and responsibilities as described in the 2021 RIG. Overall, this section has added a lot of wording as compared to the 2021 RIG, but has not improved clarity. Examples below:

New Description example:

“Chief Officer for Research Facilities (CORF). This senior executive resides within the Office of the Director, reports directly to the NSF Director, and has full life cycle oversight responsibility for NSF major research facilities.1 The CORF advises the Director on all aspects of NSF Major Facilities and Mid-scale RI throughout their life cycles and collaborates across NSF on the oversight of the NSF research facilities portfolio. The CORF is a member of the agency’s Executive Leadership Team, liaison to the National Science Board’s (NSB) Awards and Facilities Committee and chairs the Facilities Readiness Panel and the Facilities Governance Board. A deputy CORF focuses on oversight of the Mid-scale RI portfolio and chairs the Major and Mid-scale Facilities Working Group (MMFWG).”

Previous Description example:

“Chief Officer for Research Facilities (CORF) – The senior official who advises the NSF Director on all aspects of the agency's support for major and mid-scale research facilities throughout their life cycle and collaborates with NSF employees involved in oversight and assistance of the NSF multi-user research facilities portfolio.”
The RIG was changed to: “Chief Officer for Research Facilities (CORF). As required by statute, this senior executive has oversight responsibility for NSF major facilities across their full life cycle.   The CORF advises the Director on all aspects of NSF Major Facilities and Mid-scale RI, collaborates across NSF on the oversight of the research infrastructure portfolio, and is a liaison to the National Science Board (NSB) Committee on Awards and Facilities.  The CORF organization resides within the Office of the Director and is staffed to support these oversight activities.”   The reference to Deputy CORF was deleted in Section 2.1.2 Coordinating and Advisory Bodies.
2.3.3.2 Design Stage Section 2.3.3.2 Design Stage > Preliminary Design Phase: Re this statement: The primary deliverables for the PDR are an updated PEP along with an estimated cost and DEP for the Final Design Phase.

The RIG should list the Entry and Exit Criteria and Review Artifacts for the CDR, PDR, and FDR.
Changed to read: "The primary deliverables for the PDR are an updated and progressively elaborated PEP, including the revised estimated total project cost, and DEP for the Final Design Phase."
2.7.3 Operations Stage Reporting and Oversight  Questioning what happened to the ICA material in the RIG and the Financial Data Collection Tool. Incurred Cost Audit content is clarified in the RIG, and supplemental content was added to RIO webpages.
2.7.4 Recapitalization During Operations Section 2.7.4 Recapitalization During Operations provides guidance on recapitalization mechanisms.

This is a welcome addition to the RIG. This section could be improved by outlining the submission and approval process.

What are the requirements for the Asset Management Plan?

Provide an approval chain of command after: “Close consultation with the NSF PO is essential in determining the most appropriate funding mechanism based on the availability of funds and other factors.”
The submission and approval process varies depending on the program, mechanism used, and the source of funding. These are details internal to NSF and not appropriate for the RIG. This is addressed by recommending close coordination with the NSF Program Officer.
2.9 Mid-scale Research Infrastructure Guidance Section 2.9 Mid-scale Research Infrastructure Guidance is relocated from Chapter 5; it clarifies and differentiates guidance for Mid-scale RI from Major Facilities.

This section calls out using and tailoring Program Management techniques and how to document them. Suggest providing a table or bulleted list of required plans and documentation.
No change. Table 2.9.4-1 Requirements for Major Facilities versus Mid-scale RI, lists the required plans and documentation for Major Facilities and Mid-scale RI.
2.9 Mid-scale Research Infrastructure Guidance "This section should not be interpreted as standalone, comprehensive guidance for Mid-scale RI. Rather, it should be viewed as a complement to all other relevant sections of this Guide." > Reading the word "relevant" is a bit worrying, mostly because it sounds like deciding something is relevant may be left up to the reader. I'll read on and see if that ambiguity is retired. Language clarified, updated "relevant" to "applicable."
2.9.2 Expectations for Mid-scale RI Proposers and Awardees  Schedule Development: These sentences that define the "four characteristics" are really helpful. No change. The comment was favorable.
2.9.3 Mid-scale RI Life Cycle Stages  Mid-scale RI Disposition Award. Best description of disposition I've ever seen No change. The comment was favorable.
2.9.4 Summary of NSF Oversight for Major Facilities and Mid-scale RI Table 2.9.4-1 Requirements for Major Facilities versus Mid-scale RI - Excellent summary table. No change. The comment was favorable.
3.2 Tailoring, Scaling, and Progressively Elaborating Plans  This section is really excellent, to the point that it will teach scientists (and others) how to think about project management as a necessary tool for success. No change. The comment was favorable.
3.2 Tailoring, Scaling, and Progressively Elaborating Plans  Tailoring: The process of selecting an appropriate framework to define and organize the scope, management, organization, schedule, cost detail, and performance measurement methods.
>> Can you provide a simple example of selecting an appropriate framework to define and organize the scope, management, organization, schedule, cost details, and performance measurement methods?
No change. The comment was favorable.
3.2 Tailoring, Scaling, and Progressively Elaborating Plans  Scaling: The process of adjusting the level of detail, degree of formality, tools, and management processes.
>> Can you provide a simple example of adjusting the level of detail, degree of formality, tools, and management processes?
No change. The comment was favorable.
3.2.2 Scaling Level of Detail. Simple projects or programs might only develop the Work Breakdown Structure (WBS) to Level 3, which is considered the minimum by industry good practices. In contrast, large construction projects may extend to WBS Level 10 in some areas to capture the work packages in the appropriate detail for cost estimating and monitoring performance.
>> Can you provide an example describing a WBS to Level 3 and Level 10?
Text "see Section 4.2 Scope and WBS" added for clarity and further guidance.
3.2.2 Scaling Management Processes. Performance management processes also have varying degrees of formality. For example, NSF oversight requires a Major Facility to have an EVM system that is verified, accepted, and has periodic surveillance reviews during construction. In contrast, a Mid-scale RI project electing traditional waterfall methods can use a system to monitor progress against the plan using its own institutional standards or something as simple as weighted-milestone tracking (see Section 4.5 Monitoring Progress Against Plan). For operations, the management process may be handled through routine activity status reporting to NSF with actual costs against the proposed budgets for each operational WBS element.
>> Please clarify managing progress against the plan versus managing operations. Can you provide an example?
No change. The comment will be taken into account for a future RIG revision.
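As a rough sketch of the weighted-milestone tracking mentioned above (the milestone names, weights, and budget figure are hypothetical and illustrative, not NSF guidance):

    # Weighted-milestone earned value: each milestone carries a fixed share of the
    # work package budget and is credited only when complete (all values illustrative).
    budget_at_completion = 200_000  # work package budget in dollars (hypothetical)
    milestones = [
        ("Design complete",      0.20, True),
        ("Parts procured",       0.30, True),
        ("Assembly complete",    0.30, False),
        ("Acceptance test done", 0.20, False),
    ]

    earned_value = budget_at_completion * sum(w for _, w, done in milestones if done)
    print(f"Earned value to date: ${earned_value:,.0f}")  # -> $100,000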
3.3 Development Stage Planning Section 3.3 states: “There are no standard required plans for the Development Stage.” And “Any formal plans required as deliverables would be described in the funding announcement”

Some requirements for “formal plans” provided by NSF and “standard plans” should be developed such as the recommended “Master Plan” and “Key Performance Parameters” called out in the same section.
No change. The comment will be taken into account for a future RIG revision.
3.4 Design Stage Planning The Design Execution Plan (DEP) in section 3.4 shows a structure outline of ten (10) components. These components are essentially identical to the 10 components of a Construction PEP, which is shown in section 3.5. The problem is that the 10 shown in the DEP are in a different order than those of the PEP. I strongly urge you to re-order the DEP section outline to match the same order as the Construction PEP. I know that there was a lot of thought and care put into the order of the PEP, and it's unclear why the DEP was rearranged in a different order. The risk management component of the DEP was moved up, so now the outline of the DEP aligns with the PEP.
3.4 Design Stage Planning With a lower bound of $400K for a Mid-scale RI-1 Design project and an upper bound of over $100M for Major Facilities, the 374-page RIG covers an expansive range of projects and it’s not clear that it should apply uniformly across this full range. By covering this full range, it creates undue burden for projects at the lower end. There are two specific cases that warrant some consideration. Case 1: Mid-scale RI-1 Design projects at lower funding levels. Per the Mid-scale RI-1 solicitation, all Design projects, including those at the minimum $400K funding level, require a Design Execution Plan (DEP) in accordance with the RIG. However, the RIG provides just over 2 pages on the DEPs, leaving it to the proposer to determine what level of detail from Section 3.5 of the RIG to include. It would be highly beneficial for Design projects up to a certain level, say $4M, to have standardized templates that delineate specifically what the minimum requirements are for the DEP. Additionally, it’s worth considering whether a DEP should be required at all for Design projects at a lower level, say $1M and below. As an alternative, consider adding a requirement to address this in the Project Description itself and remove the requirement for the DEP supplementary document. Another option is to eliminate the need for a DEP for preliminary proposals requesting $1M or less and require it only should the proposal be recommended for a full submission. Case 2: Mid-scale RI-1 vs MRI. A Mid-scale RI-1 $4M Implementation project requires a Project Execution Plan whereas a $4M Major Research Instrumentation (MRI) project does not. In the various NSF communications about Research Infrastructure, the gap between MRI and MREFC is seen as being filled by the Mid-scale program, creating a continuum. However, the step function in project management requirements across these two programs at the $4M boundary seems both arbitrary and a disincentive for organizations who want to field a Mid-scale RI-1 at the lower level. An alternative would be to exempt Mid-scale RI-1 Implementation projects in the range of, say, $5M or less, from requiring a PEP, or, as in the case above, provide a standardized template and associated guidance. No change. As stated throughout the RIG, all plans (including the DEP) should be scaled and tailored to the project or program. Proposing organizations should use professional judgement, based on their understanding of what is being proposed. Any exemptions to the RIG would be noted in the funding announcement.
3.4 Design Stage Planning 3.4 Design Stage Planning. Formulates the DEP detailing tasks. Major Facilities undergo submission for the Conceptual Design Review and Preliminary Design Review in preparation for the Final Design Phase. The Mid-scale RI DEP is reviewed as per the funding announcement.
>> The guidance here is unclear. Is the guidance intended to state that Major Facilities applications will include a Concept Design Review and Preliminary Design Review in preparation for the Final Design Phase?
Text edited for clarity.
3.4 Design Stage Planning 3.4 Design Stage Planning. The Mid-scale RI DEP is reviewed as per the funding announcement.
>> Is a DEP required as per the funding announcement but may not be reviewed or is this saying a Mid-scale RI DEP may be required and reviewed as per the funding announcement?

Removed text in question and updated text on the Chapter landing page to align with other introductory descriptions: "Design Stage Planning. Formulates the DEP that outlines the specific tasks, milestones, resource requirements, timelines, and responsible parties necessary to carry out the Design Stage."

For clarity (no RIG changes), Mid-scale RI Awardees must submit a DEP for NSF review as required by the funding announcement or program guidance.
3.4.1 Design Execution Plan 3.4.1 Design Execution Plan
>> Please clarify if the DEP is always required or if the solicitation will indicate if this component is required or not.
Text edited for clarity using plain language, so it now reads "The Awardee must address all ten components, and any proposed subcomponents should be tailored and scaled appropriately for the Design Stage."
3.4.1 Design Execution Plan NSF Requirement Major Facilities and Mid-scale RI projects must create a PEP, including all components and subcomponents, tailored and scaled appropriately for the Construction Stage or implementation. >> Consider making this language consistent, as it relates to non-applicable subcomponents, with the guidance at the end of page 97, which states:
“Each PEP component is required, regardless of project size, but some subcomponents may not be applicable for all projects. Proposers must address all components and subcomponents and may indicate Not Applicable for any that do not apply and provide a rationale for that determination. “
Changed to read: “The DEP leverages the 10-component format of the PEP described in Section 3.5 Construction Stage and Implementation Planning. The Awardee must address all ten DEP components outlined below and any proposed subcomponents should be tailored and scaled appropriately for the proposed design activities.”
3.5-1 PEP Overview Map Figure 3.5-1 PEP Overview Map >> Does the shading of certain components and sub-components hold significance? If this is only for aesthetics I suggest adding a note next to the figure description indicating that the color shading does not convey any significance. Text added for clarity - "Shading in table is for improved readability."
3.5.1   PEP Component 1 – Project Overview Table 3.5.1-1 Project Overview Subcomponents, Products, and Documents with References to Future Material and Related Topics - Project Mission Statement should be in accordance with the award instrument used.
>> An example of a mission statement that is in accordance with the award instrument used would be helpful here. I am not clear how the award instrument would affect the project mission statement.
Text in table edited for clarity.
3.5.2.2 PEP Subcomponent 2.2 – Internal Project Organization  (Projects that are cyclical…end of paragraph) Wrong figure reference; should be 3.5.2.2-2 again. Figure reference updated to 3.5.2.2-2.
3.5.2.2 PEP Subcomponent 2.2 – Internal Project Organization  (Fig 3.5.2.2-2) this figure caption is unintelligible Figure caption updated.
3.5.2.3 PEP Subcomponent 2.3 – External Project Stakeholders  Wrong figure reference; should be 3.5.2.3-1 perhaps? Figure reference updated to 3.5.2.3-2.
3.5.3.1 PEP Subcomponent 3.1 – Overview of the Performance Measurement Baseline and Total Project Definition  TPD is NOT defined on the gold card. No changes to RIG. "TPD" added to NSF Gold Card.
3.5.3.1 PEP Subcomponent 3.1 – Overview of the Performance Measurement Baseline and Total Project Definition  (The Subcomponent overview…) the parenthetical definitions in this paragraph are better than the sentences used in the previous paragraph. Perhaps move these forwards... No change. The comment will be taken into account for a future RIG revision.
3.5.3.1 PEP Subcomponent 3.1 – Overview of the Performance Measurement Baseline and Total Project Definition  (A time-phased…) the word "obligation" means different things to different people. please be more specific or give example. Updated three instances of the word "obligation" for clarity—replacing it with "requirement" in Sections 2.9.2 (Expectations for Mid-scale RI Proposers and Awardees) and 3.5.2.2 (PEP Subcomponent 2.2 – Internal Project Organization), and removing one instance from Section 5.3.8 (Data Management and Curation).
3.5.3.1 PEP Subcomponent 3.1 – Overview of the Performance Measurement Baseline and Total Project Definition  (A time-phased…)Table 3.5.3.1-2 Commitment and Funding Profile by FY Sample Table does not give an example of TPC_AWD. It shows TPC_EVM. Perhaps add a row to table showing an optional "fee", or a row below TPC_EVM showing optional fee and a row below that showing a total sum of TPC_AWD. Text edited to align with table.
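For illustration of the suggested additional table rows (dollar figures are hypothetical), the relationship the commenter describes is simply TPC_AWD as the sum of TPC_EVM and any optional fee:

    # Hypothetical totals illustrating the suggested rows: TPC_AWD as the sum of
    # TPC_EVM and an optional fee row (values are illustrative only).
    tpc_evm = 95_000_000   # total tracked under EVM (illustrative)
    fee     =  5_000_000   # optional fee, if any (illustrative)
    tpc_awd = tpc_evm + fee
    print(f"TPC_EVM ${tpc_evm:,} + fee ${fee:,} = TPC_AWD ${tpc_awd:,}")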
3.5.3.2 PEP Subcomponent 3.2 – Scope  Good Practices and Practical Considerations: Nouns versus verbs. Excellent advice. No change. The comment was favorable.
3.5.4 PEP Component 4 – Risk and Contingency Management  Very clear directions. Pretty much a "how-to" get it right. No change. The comment was favorable.
3.5.4 PEP Component 4 – Risk and Contingency Management  "Why Is This Component Important?" - These "why important" sections are really valuable guides.  No change. The comment was favorable.
3.5.4 PEP Component 4 – Risk and Contingency Management  "How To Develop and Write This Component" - These "how to" sections are also extremely helpful guides No change. The comment was favorable.
3.5.4.3 PEP Subcomponent 4.3 – Contingency Management Plan  "Liens List: Forecasting and Opportunity Management" - Excellent definition. No change. The comment was favorable.
3.5.4.3 PEP Subcomponent 4.3 – Contingency Management Plan  "Good Practices and Practical Considerations" - Again, really excellent advice to give "good practices" as suggestions. No change. The comment was favorable.
3.5.5 Acquisition Plans The scope of section 3.5.5 "Acquisition plans", specifically 3.5.5.3 (System Engineering and Quality Management Plans) and 3.5.5.4 (Resource Management Plans), seems poorly defined in relation to the overall PEP. The two mentioned sections appear not to be confined to "Acquisitions" and to require duplication of the project's organization & roles (3.5.2), quality acceptance (3.5.3.3), risk (3.5.4), etc. Revision and clarification of this section is an urgent need. As written I am left to wonder whether these sections should not have been integrated into earlier sections, been removed, or explicitly confined to major "Acquisitions". No change. The comment will be taken into account for a future RIG revision.
3.5.5.3 PEP Subcomponent 5.3 – Systems Engineering and Quality Management Plans  (Fig 3.5.5.3-1) Opinion: the order is wrong. We think that science requirements come before KPPs. Image updated such that Science Requirements come before KPPs.
3.5.5.3 PEP Subcomponent 5.3 – Systems Engineering and Quality Management Plans  (Separate awards are..) This is quite profound. You are saying that the construction project is not yet complete but pre-award operational funds, or during award operational funds, will be used for transitioning. This is very practical but not often understood because of the strict wording of "segregation" mandates. No change. Proper segregation of funds is essential and achieved by making two separate awards; one for construction and one for initial operations. Proper allocation of costs between these awards is based on the Awardee's Segregation of Funding Plan.
3.5.5.4 PEP Subcomponent 5.4 – Resource Management Plans  This is a good addition. We hadn't done these in the past, in any formal sense, but had internal plans and had recognized as a risk that resource availability was an assumption.

We had also included this in a "risk analysis" appendix to the PEP. I like putting it inside PEP.
No change. The comment was favorable.
3.5.7.1 PEP Subcomponent 7.1 – Overview of Project Controls  Figure 3.5.7.1-1 Project Controls Process Flow Chart: Interactions Among Subcomponents with Established Total Project Definition > One thing I've found valuable here is thinking about the "pre-award" activities as distinct from post award. It is valuable for the project management team to plan for how the proposal activities (i.e. basis of estimate, etc.) flow into tracking during post award. In this figure, the "project definition" box is what I'd call "pre award". No change. The comment was favorable.
3.5.7.2 PEP Subcomponent 7.2 – Performance Measurement and Management Plans The selection of Project Controls tools depends upon the chosen PMM method, which should be tailored and scaled to meet project needs. For example, Major Facilities construction projects must use verified Earned Value Management (EVM) as the PMM method, which entails the use of tools such as EVM software applications and involves adherence to NSF Earned Value Management System (EVMS) guidelines...
>> This is very helpful! Can NSF provide examples of non-verification EVM that can be used for simpler projects or an example of the layout of a spreadsheet that would meet the requirement for a PMM tool?
No change. The comment was favorable.
3.5.7.3 PEP Subcomponent 7.3 – Change Control Plans  "Change vs Configuration Control" > This is very good to define this distinction. No change. The comment was favorable.
3.5.7.3 PEP Subcomponent 7.3 – Change Control Plans  Table 3.5.7.3-1 Sample Change Request: Approval Thresholds and Authorities for a Medium Complexity Major Facility Project > This is a very good example table. I would have used this verbatim. No change. The comment was favorable.
3.5.9 PEP Component 9 – Project Closeout Plans  "Project Closeout Plans" > Excellent definition. No change. The comment was favorable.
3.6 Operations Stage Planning Section 3.6 Operations Stage Planning contains improved guidance on the Annual Work Plan and Facility Condition Assessment of a Major Facility.

The new Operations Stage Planning section of the 2025 Research Infrastructure Guide, Section 3.6, under public discussion in NSF programs from 2022 through 2024, represents a substantial advancement in the depth and quality of guidance for major facility operators over previous requirements stated in RIG 2.5.1; Cooperative Agreement – Award Specific Financial Administrative Terms and Conditions (FATC); and Modifications and Supplemental FATC ¶85. The new §3.6 provides a strong conceptual structure including specific elements sufficient to enable development of a detailed approach to the scope and processes for conducting, reporting and using the results of a Facility Condition Assessment (FCA) and the Asset Management Plan (AMP). Our organization was able to develop a detailed initial draft FCA process based on the guidance as presented in drafts NSF made available to the major facility community through June 2024.
No change. The comment was favorable.
3.6 Operations Stage Planning Continued from comment above: Sections 3.6.2.1 and 3.6.2.3 discuss identifying “resources needed” and corresponding timelines to address FCA Report issues as well as cost estimates within the FCA Report and Asset Management Plan. Section 3.6.2.4, “Identify Funding Needs” requires submission of annual estimated costs to execute the Asset Management Plan. NSF-conducted public presentations on conducting Facilities Condition Assessments indicated cost estimates likely would be expected at a quality/accuracy level of concept or feasibility studies. (See How to Conduct FCAs – P. Anticona, RI Workshop, June 2023). It is understood that there is a substantial difference in labor, documentation, and preparation time between cost estimates and schedules developed under §4.3 that support specific financial assistance proposals and industry-standard practices for cost estimates and timelines at a concept or feasibility study level. Although §4.3 suggests that a major facility operator “tailor and scale” GAO and NSF cost estimating and scheduling guidance to its facility, “departures” from the RIG and GAO estimating structures must be noted and justified. We suggest that specific guidance on NSF’s expectations for FCA/AMP cost estimating and schedule standards be included within §3.6, or include the potential for collaborative review of the FCA results (that may include feasibility study level cost estimates and related timelines) to inform successive steps toward more detailed estimates culminating in meeting the standards under §§4.3 and 4.4 that would provide a sufficient basis for funding proposal alternatives under §§2.7.2, 2.7.4. This approach appears to be consistent with guidance under §3.2.

Both CA FATC (§1.8) and RIG (§3.6.2) require that the facility condition assessment address potential vulnerabilities and needed upgrades against natural hazards anticipated from climate change, as well as mitigations through resilience to climate change. In view of recent events, revised or additional guidance seems appropriate on the future natural hazards major facility operators should anticipate and address in the facility condition assessment.
NSF expects cost estimates submitted to NSF for planning/budgeting purposes as part of the Asset Management Plans to be ROM estimates. Only when submitted as part of the Annual Work Plan or separate proposal for actual funding should the estimates comply with GAO characteristics for a high-quality cost estimate (well-documented, accurate, credible and comprehensive). Text edited in Section 3.6.2.4 for clarity.
4.2 Scope and Work Breakdown Structure  Table 4.2-3 Example Format of an Activity-based Operations WBS > Giving these different style WBS examples is instructive and very useful to help strategic planning. No change. The comment was favorable.
4.4.3.1 Ten Steps to Develop Baseline Schedule  Very helpful "how to". No change. The comment was favorable.
4.5 Monitoring Progress Against Plan Suggest rewording the sentences that begin with “NSF aims to…” and “NSF also aims to…” to be less ambiguous. (Word p.50) No change. The comment will be taken into account for a future RIG revision.
4.5.4.1 Earned Value Management – The Seven Principles Recommendation for Improvement: Section 4.5.4.1 – Earned Value Management, Principle 4 The current guidance states: “Use actuals incurred and performance attained in accomplishing the work performed.” To strengthen the integrity of Earned Value Management (EVM) reporting, the RIG should be more prescriptive in requiring that actuals for labor be based on the actual hours spent on the specific work package, rather than relying on pre-allocated estimated effort used for budgeting. In many cases—especially within university-led Mid-Scale projects—there is no established practice of recording the actual time spent on tasks within a work package. Instead, accounting systems often apply the planned allocation as actual cost, and individuals are only asked at the end of a semester whether they worked the estimated hours, often without a formal reconciliation step. This practice leads to inaccurate cost variance and cost performance index (CPI) calculations, diminishing the effectiveness of EVM. Without a clear requirement to track and report actual time spent, the value of EVM as a performance measurement tool is significantly weakened.
>> Recommendation: The RIG should explicitly mandate that labor actuals be based on recorded hours worked per work package, rather than pre-allocated estimates. Implementing this requirement would improve the accuracy of cost performance metrics and strengthen EVM’s role in project oversight.  
No change. The comment will be taken into account for a future RIG revision.
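A small worked example of the distortion the commenter describes, using hypothetical hours, rate, and earned value (CPI here is simply earned value divided by actual cost):

    # Effect of booking planned labor as "actuals" versus recording hours worked.
    rate        = 100      # $/hour (hypothetical)
    planned_hrs = 400      # budgeted effort for the work package
    actual_hrs  = 520      # hours actually spent
    earned_value = 40_000  # value of work actually accomplished (hypothetical)

    ac_from_plan    = planned_hrs * rate   # 40,000 if planned effort is booked as actual
    ac_from_records = actual_hrs * rate    # 52,000 from recorded timesheets

    print(f"CPI with planned-as-actual labor: {earned_value / ac_from_plan:.2f}")     # 1.00, overrun hidden
    print(f"CPI with recorded labor hours:    {earned_value / ac_from_records:.2f}")  # 0.77, overrun visible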
4.5.4.2 Verified EVMS 4.5.4.2 Verified Earned Value Management Systems. The NSF aims to complete the Compliance Evaluation Review before awarding construction funds. Additionally, NSF intends to accept the project's EVMS before physical construction or major acquisitions begin, contingent on the satisfactory resolution of findings from the Compliance Evaluation Review. Having participated in four of the eight NSF EVM Verifications and numerous surveillance reviews, I strongly recommend that a Compliance Evaluation Review be conducted around the time of the Final Design Review (FDR). This review would serve as a preparatory step for the performing organization and Control Account Managers (CAMs), exposing them to the rigorous line of questioning they will face during the EVM Verification Review. Despite extensive preparation, CAMs often do not fully appreciate the depth and nature of the NSF EVM Review Team Panel’s inquiries until they experience the actual review. Furthermore, I recommend scheduling the formal EVM Verification Review approximately 6 to 9 months after the award date. This timing would enhance the review process by allowing for the evaluation of actual project data in addition to assessing compliance with EVM processes, leading to a more meaningful and effective verification.  The steps in the evaluation review process remain unchanged in the RIG. However, the term "compliant" was replaced with "following guidance" in 4.5.4.1 Earned Value Management – The Seven Principles and 4.5.4.2 Verified Earned Value Management Systems. Additionally "Compliance Evaluation Review" was replaced with "EVMS verification review."
4.5.4.3 Non-Verified EVMS  Figure 4.5.4.3-1 NSF Scaled (Non-Verified) EVMS Process: Application of 18 of 32 EVMS Guidelines Aligned with the 7 Basic Principles of EVMS > This figure is excellent, easy to understand and follow. No change. The comment was favorable.
4.6 Risk Management  Table 4.6-1 Risk, Opportunity, and Threat per RIG Definition > I don't see you using "threat" in this section, but rather risk with definite negative impact. I'd either leave out "threat", or, if you feel that is common vernacular, include it as a parenthetical phrase in the definition of "risk". Revised one instance of "negative risk" to the term "threat" for consistency.
4.6.2 Step 2 – Identify and Document Risks  Table 4.6.2-1 Sample Risk Register > Excellent example for new PMs. Note that some will not use "status", but rather have a probability of occurrence called "retired". That's a convenience for sorting the table by probability. No change. The comment was favorable.
4.6.2 Step 2 – Identify and Document Risks  Figure 4.6.2-1 Risk Breakdown Structure Example > I've seen, and we use, an RBS that applies the risks in the risk register to categories in the project WBS. Just a different way of looking at things, but it makes it easier to create change request forms that link risks to WBS categories. Also, viewing it that way shows which elements of the WBS are susceptible to the elements in the risk table. No change. The comment was favorable.
4.6.6.2 Parametric Method I am really glad to see section 4.6.6.2 Parametric Method – Risk Factor Analysis in this version. This is an excellent practice for bottom-up cost estimating.  No change. The comment was favorable.
4.7 Contingency Estimating and Management Section 4.7 Contingency Estimating and Management is decoupled from risk management.

It’s interesting that this section specifically calls out: “NSF Requirement: Major Facility projects must use a combined cost and schedule risk analysis using Monte Carlo methods and select a value within the 70-90% confidence range.”

All the other sections allow for various methods for other project management processes, but this one is setting a hard requirement by stating “must use”. By stating this as a “Shall” requirement it’s at the same level of importance as using EVMS during the construction phase.

There are alternatives such as a decision tree analysis.
Three instances of "shall" in Section 4.2.2 Characteristics of a Reliable Schedule were changed to "should". The current text explains the rationale for the 70-90% confidence range and the use of Monte Carlo methods; no change.
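For illustration, a minimal cost-only Monte Carlo sketch using the Python standard library shows how a contingency value within the 70-90% confidence range can be read off a simulated distribution; the risk probabilities and impact ranges are hypothetical, and an actual combined cost and schedule risk analysis would also model schedule impacts:

    import random

    random.seed(0)  # reproducible illustration

    # Illustrative risk register: (probability of occurrence, min impact $, max impact $).
    risks = [
        (0.30, 100_000, 500_000),
        (0.10, 200_000, 900_000),
        (0.50,  50_000, 250_000),
    ]

    def one_trial():
        """Total realized cost impact for a single Monte Carlo trial."""
        total = 0.0
        for prob, low, high in risks:
            if random.random() < prob:
                total += random.uniform(low, high)  # other distributions are also common
        return total

    trials = sorted(one_trial() for _ in range(10_000))
    p80 = trials[int(0.80 * len(trials))]  # a value within the 70-90% confidence range
    print(f"Contingency at ~80% confidence: ${p80:,.0f}")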
4.7.1 Allowable Contingencies  Figure 4.7.1-1 Methods for Determining Contingency Allocations > This is an excellent figure, showing succinctly what considerations belong in what plan. I definitely love that quality is added to scope considerations, because we're often making decisions that still deliver scope, but do so with varying degrees of quality. No change. The comment was favorable.
4.7.2.3 Scope Contingency  Figure 4.7.2.3-1 Comparison of Risk Exposure to Available Contingency Over Time > This is an excellent and profound figure. I'm going to emulate this ASAP.

The only issue I have with it is that the Risk Exposure is unlikely, on most projects, to vary that smoothly. To be consistent, when a major risk is realized, the risk exposure should change abruptly.
No change. The comment was favorable.
4.7.3.1 Contingency Management Controls  Table 4.7.3.1-1 Baseline Change Control Authority Levels > Again, an excellent table that should be copied into all PEPs. No change. The comment was favorable.
4.7.3.1 Contingency Management Controls  Key Takeaway box: "Like EVM itself, use of budget contingency is strictly a reporting (paper) exercise to show movement of budget, not a financial/accounting exercise. Budget contingency is never shown as an actual cost or expense." > Please rethink this, or add to it. Budget contingency is a paper exercise, but once budget contingency is allocated to the budget, or the allocation process itself, is/are financial/accounting exercises. Takeaway box text changed to read: "Like EVM itself, use of budget contingency is strictly a reporting (paper) exercise to show movement of budget, not a financial/accounting exercise in terms of tracking actual costs. Use of budget contingency is an estimate of potential future costs based on realized risks, where the actual costs may not be incurred for weeks, months or years."
4.7.3.3 Contingency Management Forecasting  (If contingency was authorized…) WHAT! allocation of contingency does change the budget and therefore BAC, and has to. It might also change EV if the variance was a cost correction to tasks that are previously completed. This statement could be the source of lots of confusion.

Oh, OK, I see below that you allow that variance to be tracked and kept in the Liens List. Not sure why you allow that complication, because, in the end, BAC should be reflected and be the target cost to complete all project scope.
The text was clarified to "The sum of the EAC and liens should include variances (backward-looking actuals) and updated estimates (forward-looking forecasting) in the current plan." and redundant text was removed for further clarity.
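A tiny numeric illustration of the clarified sentence, using standard EVM conventions (EAC as actual cost to date plus estimate to complete) and hypothetical figures:

    # Forecasting with contingency liens: EAC captures realized variances plus the
    # estimate to complete; open liens capture anticipated future contingency use
    # not yet moved into the baseline (all values illustrative).
    bac   = 10_000_000   # budget at completion for distributed work (hypothetical)
    ac    =  4_200_000   # actual cost to date (backward-looking actuals)
    etc   =  6_100_000   # estimate to complete remaining work (forward-looking forecast)
    eac   = ac + etc     # estimate at completion
    liens =    300_000   # anticipated contingency use tracked on the liens list

    print(f"EAC ${eac:,} vs BAC ${bac:,} (variance ${eac - bac:,})")
    print(f"EAC plus open liens: ${eac + liens:,}")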
5.3 Information Assurance RISC Comment #1: NSF should continue to emphasize taking a programmatic, risk-based approach to cybersecurity.  No change. The comment was favorable.
5.3 Information Assurance RISC Comment #2: NSF should return to its approach of identifying high-level cybersecurity guidance and concerns in the RIG, and increase its use of other mechanisms for discussing more detailed cybersecurity recommendations.  No change. The current approach ensures that Awardees have clear expectations for cybersecurity practices while allowing flexibility for implementation based on their specific operational environments.
5.3 Information Assurance RISC Comment #3: NSF should avoid the term “Information Assurance” and return to using “cybersecurity” to set the scope of Section 5.3.  No change. Retaining “Information Assurance” ensures comprehensive guidance across cybersecurity and organizational security policies.
5.3 Information Assurance RISC Comment #4: NSF should align the language in Section 5.3 with the usage laid out in Section 1.1, removing the words “obligation,” “expectation,” “requirement,” “essential,” and other similar words.  The language was clarified to better distinguish mandatory requirements from recommended best practices, providing clearer compliance guidance. Terms such as "obligation," "expectation," "requirement," and "essential" were reviewed and, where appropriate, replaced with "should" and "must."
5.3 Information Assurance RISC Comment #5: NSF should remove the “NSF Critical Controls” and refocus its guidance on adopting and evaluating against a baseline control set.  No change. NSF staff use the critical controls as a baseline for good practices and to assess the adequacy of awardee systems.
5.3 Information Assurance RISC Comment #6: NSF should clarify or remove references to “phishing-resistant” multifactor authentication. The term "Phishing" was updated to "Phishing is considered a technique for attempting to acquire sensitive data through fraudulent computer-based means." (https://csrc.nist.gov/glossary/term/phishing)
5.3 Information Assurance RISC Comment #7: NSF should clarify that existing documentation can be used to satisfy the requirements in Section 5.3.  No change. The guidance allows flexibility for Awardees to map existing documentation to their unique IAMP.
5.3 Information Assurance RISC Comment #8: NSF should clarify the specifics of the Program Assessment in subsection 5.3.11.  Context was added to:
• Clarify that the assessment will be tailored and scaled to align with the Program
• Make clear that the Program Assessment is specific to the IA Program
• Clarify the issue around "significant, reportable incidents being avoided"
5.7.1 Key Personnel  It is really helpful to define these roles and responsibilities.  Many projects have their own ideas which leads to chaos. No change. The comment was favorable.
5.9 Agile Guidance Section 5.9 Agile Guidance provides new content and guidance on applying Agile methodologies to NSF awards.

Interesting that this guide calls out “Agile Development Methods”. That means NSF Program Officers need to accept scope adjustments and uncertainty as well as WBS flexibility. I agree with Agile being an option, but this seems like a big jump from the last guide.
No change. The comment was favorable.
5.9 Agile Guidance I haven't the time nor expertise to review this section. More power to you for giving examples of flexibility in PM. No change. The comment was favorable.
Other It would be interesting to see NSF’s “Internal Management Plan” (section 2.2).

There are many new references to Agile, including an entirely new section. The additional section was highlighted under the Significant Changes, but “Agile” appears 99 times throughout the entire document.

The RIG needs to provide more guidance related to the “what,” “who,” and “how” of Facility Commissioning, including (1) Can portions of a facility be commissioned?, (2) What if the verification/validation test results do not match the Plan?, (3) What is the process?, and (4) What is the role of the PO in Commissioning?

Requirements management and verification and validation activities need to be spelled out.
No change. The recommendations will be considered for next RIG revision.
Other Question about FRN FTE counts and scope No change. Explanation sent via email during public comment period.