Appendix A. Summary of Public Comments and Consultation

GGRF Accomplishment Reporting ICR: Summary of Comments

The purpose of this document is to provide a comprehensive summary of comments provided by public commenters in response to the GGRF Accomplishment Reporting ICR (through the EPA docket), as well as comments provided by consultees in discussions held as part of the ICR burden consultation. This summary is not intended to be an exhaustive list of every specific argument raised by public commenters or consultees.

The summary of public comments and feedback from consultation is combined in the table below.

Comment Name and Description | Feedback (Summary)

Organization #1

Through public comment and in discussion/written feedback provided as part of the burden consultation, Organization #1 recommended:

  • Providing more time for quarterly reporting (45 days)

  • Allowing for proxies and estimates based on defined data and standards. Requiring data from originating lenders that is beyond standard loan applications would significantly restrict the universe of originating lenders participating in the program.

  • Allowing for N/A and unknown picklist selections for certain data fields and certain loan types and sizes.

  • Indicating in the Data Dictionary the definition, method (e.g., projected or actual), and intended frequency of reporting (e.g., quarterly or annual, one-time or ongoing) for each metric, keeping in mind the additional burden of quarterly reporting.

  • Streamlining LIDAC classification and evaluation, for example through a consolidated mapping tool for eligibility screening (across the four categories) similar to the CDFI IMS, or a consolidated CSV of CEJST census tracts and EJScreen census blocks.

  • Clarifying terminology and the frequency of reporting, including what is meant by third-party validation, verification, and assurance.


Organization #1 also expressed concerns about labor reporting: these metrics need to be reported on a best-effort basis, and rate of pay and total hours worked will be difficult to collect at the start. It recommended providing further guidance on the applicability of DBRA and BABA, especially for small projects (e.g., under $3 million in federal funds invested or under 1 MW of solar installed).

Organization #2

Through public comment and in discussion/written feedback provided as part of the burden consultation, Organization #2 recommended:


  • Establishing a uniform standard for data collection, measurement, and reporting to protect the integrity of the data.

  • Requiring all award winners to meet and confer with EPA to determine the best way to build out and apply a standardized system.

  • Developing new technology to enable uniform reporting for recipients and sub-recipients.

  • Implementing short-term modifications for response times to allow for a “learning up” period.

  • Modifying AVERT, COBRA, or other models to measure the emissions benefits of smaller projects.

  • Requiring a standardized reporting method for economic impacts of investment.

  • Reevaluating the burden analysis, which vastly underestimated the burden in the first Federal Register notice.


These proposed considerations will help create consistency in reporting and enhance credibility.

Organization #3

Through public comment and in discussion as part of the burden consultation, Organization #3 recommended:


  • Providing definitions of important GGRF programmatic terms in the award agreement and compliance materials so award recipients and subrecipients are clear on project eligibility and expected compliance tracking. This will also ensure that EPA is receiving standardized reporting across awardees and that entities wishing to receive assistance understand project and potential reporting requirements.

  • Allowing modeling and sampling for GGRF-assisted properties in areas of the country where whole-building data is not made available by local utility providers. The inconsistency of utility data access was recently highlighted in a letter from EPA, the U.S. Department of Housing and Urban Development, and the U.S. Department of Energy to state and local utility commissioners asking them to ensure this data is available for energy efficiency improvements of multifamily properties.

  • Continuing engagement through the award term, providing recipients flexibility for compliance requirements that are impeding effective delivery of GGRF resources. This could include modifying such requirements during the compliance period and other changes based on awardee and EPA experience.

  • Creating exemptions from Davis-Bacon requirements for construction projects under a certain dollar threshold, or when GGRF funds are a comparatively small portion of total project costs, to ensure these requirements do not preclude using these resources for eligible projects. In addition, such requirements should generally be applied only to large-scale development projects, rather than to working capital for small businesses, market building, predevelopment, and other non-construction activities.

  • Advising against real-time reporting, which would require too much effort and expense for too little value.

  • Being thoughtful and thorough about definitions and evaluating whether what is asked for is actionable, especially at the proposed frequency of reporting; little change should be expected quarter to quarter.

  • Replacing quarterly progress reports with oral consultations or debriefs to exchange ideas and best practices on transforming the market.

  • Meeting as a cohort to create one reporting platform that can optimize reporting, increase effectiveness, and create alignment.

Organization #4

Through public comment and in discussion as part of the burden consultation, Organization #4 recommended:


  • Streamlining and reducing reporting burden for mission-driven community leaders.

  • Reducing or limiting the amount of benefits data (particularly climate and air pollution data) collected from subrecipients to drive down set-up and ongoing reporting costs.

  • Scaling the level of reporting detail required on a project to the scale and size of the project. Under CCIA, community lenders will fund a high volume of impactful small-scale projects; it is simply not practical to require the same level of reporting on a $30,000 residential solar installation as on a multi-million-dollar renewable energy development.

    • For some categories of projects that are smaller, simpler, and more uniform, EPA could consider allowing lenders to report on key project attributes and rely on modeled or derived savings or GHG reductions. This approach can be validated as needed by using statistical sampling methods.

  • Deciding on a standard methodology for calculating emissions reductions. This will reduce burden, create consistency, and support the program objectives.

  • Taking advantage of industry experience in energy savings and carbon emissions reporting, as well as existing data, analysis, and predictive tools to help lenders report on emissions, energy and climate-related metrics.

    • Considering Energy Star Portfolio Manager as a solution for data collection and analysis for many building projects. Portfolio Manager is an easy, streamlined way of measuring savings from GGRF investments; it is regularly updated by EPA, and it is weather-normalized.

    • Considering RMI's Green Upgrade Calculator (https://greenup.rmi.org/) for project categories like green homes and clean transportation.

    • Considering new data collection and analysis tools such as advanced smart meters and building “digitization” that could simplify the process for many projects. Allowing provisions for incorporating new, efficient tools that deliver equivalent results.

    • Considering the sampling methodologies used by ACEEE and Bright Power to evaluate Bank of America's EE Financing program for CDFIs (https://www.aceee.org/sites/default/files/publications/researchreports/f1601.pdf). Utilities and their regulators have used evaluation, measurement, and verification methods to gauge energy efficiency program effectiveness for years.

  • Releasing detailed guidance describing the parameters and final reporting requirements as soon as possible to allow for time-intensive IT and supporting implementation to occur.

  • Setting thresholds for Davis-Bacon compliance reporting, similar to the U.S. Treasury's approach for ARPA funds (e.g., the State and Local Fiscal Recovery Fund).

The CCIA program's robust reporting requirements must not become a barrier that prevents small mission-driven community lenders from accessing the program or dissuades lenders from investing in highly impactful but small-scale projects.

Organization #5

Through written feedback, Organization #5 recommended:


  • Reporting aggregated consumer loan data at the census tract level. Depositories, such as credit unions, will find it challenging to report the detailed data in the data dictionary.

  • Allowing 90 days for Transaction and Project System reporting.

  • Establishing a common, pre-approved methodology for cross-walking the lending outputs of subrecipient community lenders to the benefits calculations that recipients prepare based on the reporting they receive from sub-recipients.


Expecting each community lender to perform its own benefits calculations would increase the burden on lenders and introduce a high level of risk to the overall program and to all recipients. Variation in assumptions and methodology would open the door to unnecessary risks to the integrity of the data and the program.


Organization #6

Through public comment and in discussion as part of the burden consultation, Organization #6 recommended:


  • Standardizing metrics and reporting systems for sub-recipients of the CCIA program: investee data, community benefits, and emissions reductions. These data points are not commonly reported across the CDFI industry.

  • Clarifying whether reporting time frames are based on the calendar year or the fiscal year. Differences in recipient and sub-recipient fiscal year-ends will increase the cost of reporting. Under most organizations' bylaws, audits are generally completed 180 days after the fiscal year-end date; this discrepancy in year-end dates would make it difficult for some organizations to provide timely reports that comply with the EPA timeline.

  • Extending the reporting timeline to sixty days after quarter-end and year-end. The chains of data from sub-recipients to recipients are longer than in other EPA grant programs and will require additional time and quality checks to ensure data integrity and compliance with EPA Order 1000.33: U.S. EPA Policy for Evaluations and Other Evidence-Building Activities.

  • Considering a longer period of 120 days for first-year reporting to give time to develop reporting technologies and train all recipients and sub-recipients on the reporting tools and systems.

  • Providing additional detail regarding the definition of a 'project': at what stage of project development should a project be reported, what level of detail will be required, and what metric(s) are desirable for project location (e.g., zip code, latitude/longitude, other)?

  • Delaying the requirement of Scope 1 and 2 emissions until the second reporting year.

  • Including program income in annual reports rather than quarterly. For most Native CDFIs, this metric will not be available until at least year two of the program.

  • Developing a software platform to enable uniform reporting for recipients and sub-recipients and requiring a standardized reporting method.


The cost burden analysis completed by EPA vastly underestimates the time and hourly labor costs that it will take to adequately prepare responses. Organization #6 estimates that its burden will be significantly higher than EPA’s estimate because each recipient will be required to collect data from multiple sub-recipients prior to compiling their own reports. Without a standardized platform or automated data collection, Organization #6 estimates that the hourly time burden per response will be more than double EPA’s initial estimates. In addition, EPA’s estimate does not reflect initial technology support costs.

Organization #7

Through public comment and in discussion as part of burden, Organization #7 recommended:

  • Considering Organization #7's definitions of "project" and "community." A project is the physical facility that produces, stores, and provides energy as needed; community metrics describe a person's experience with the projects and programs.

  • Automating data reporting. Organization #7’s proprietary platform is a software tool built to automate community solar data and subscription management processes, allowing the organization to meet reporting needs more efficiently and effectively. The platform reduced staffing needs by 1.33 FTEs in some cases.

Organization #8

Through public comment, Organization #8 recommended:


  • Developing a metrics reporting and evaluation plan, including a data dictionary, that is specific about the assumptions (e.g., grid factors, low-income thresholds), calculations, datasets, and definitions, so that each recipient does not spend time and resources clarifying its individual approach. Many sub-recipients (who may be small businesses) will not have the time or resources to do so.

  • Developing definitions and standards for investment types, technologies, energy off-takers (or savings), guarantees, impacts, etc.

Organization #9

Through public comment, Organization #9 recommended:


  • Compelling utility companies to provide easy access to data and to reduce the cost of benchmarking.

  • Requiring all building retrofit or highly efficient new building construction projects that receive GGRF funding to perform annual energy benchmarking.

  • Budgeting for the cost of energy benchmarking.

Organization #10

Through public comment, Organization #10 recommended:


  • Allowing time to build reporting capacity.

  • Developing clear guidelines on use of funds for capacity building.

  • Clearly defining environmental and community-based impact metrics and developing guidelines on how to measure them.

  • Leveraging existing reporting structures and templates for collecting data.


