
United States Department of Energy

Supporting Statement

Energy Efficiency and Conservation Block Grant Program Evaluation

OMB Control Number: 1910-New


A. Justification


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the information collection.


The Energy Efficiency and Conservation Block Grant (EECBG) Program, authorized in Title V, Subtitle E of the Energy Independence and Security Act (EISA) and signed into law on December 19, 2007, was funded for the first time by the American Recovery and Reinvestment Act of 2009 (ARRA). The Funding Opportunity Announcement (FOA) for Formula Grants was issued on June 25, 2009, and closed on June 25, 2010. Over $2.7 billion was distributed through Formula Grants to about 2,350 cities, counties, states, territories, and Native American tribes. This funding reflects a Department of Energy priority to increase energy efficiency activities and renewable energy installations across the country while decreasing overall energy use and associated greenhouse gas emissions, creating jobs, and stimulating the economy.

The Program was designed to enable grant recipients to create and implement strategies to:

  • Reduce fossil fuel emissions

  • Reduce total energy use

  • Improve energy efficiency in the building and transportation sectors.


Recipients were encouraged and given the flexibility to develop new and innovative approaches across these three focus areas that would yield long-term sustainable impacts. Grants could be used in any of 14 eligible Activity areas, referred to in this document as Broad Program Areas (BPAs). All funds were required to be committed within 18 months of award and fully expended within 36 months.

The EECBG evaluation presents a complex challenge. Evaluators must understand the overall objectives of the EECBG Program, the variations on those objectives within each grant (and, in the case of State grants, their sub-grants), and the variety of unique projects (referred to as “Activities”) carried out under a grant. Much of the funding is directed to projects resulting in direct energy impacts. Other components are structured to achieve market development and transformation goals, and still others provide a platform to increase overall awareness and aid in state and local long-term planning efforts.

The evaluation of the EECBG Program is intended to “document the Program’s principal achievements and provide valuable information for policy makers and program managers to help inform future energy efficiency and renewable energy efforts.”1 This will require a combination of qualitative and quantitative approaches designed to effectively communicate both the direct energy impacts and the features that enabled success for grantees.

Statutory Authority – Title V, Subtitle E of the Energy Independence and Security Act of 2007 (EISA) authorizes DOE to establish and administer the Energy Efficiency and Conservation Block Grant (EECBG) Program. Section 547 (Review and Evaluation) authorizes the Secretary to review and evaluate the performance of entities receiving grants under the Program.


The Grant Activity Manager Survey is the primary data collection tool being used to confirm project information from DOE’s Performance and Accountability for Grants (PAGE) reporting system and to gather more detailed information not available from other sources. Extensive review of the PAGE data shows significant gaps and/or lack of detail in the information necessary for calculating program impacts. These data consist of information regarding the specific energy saving activities undertaken by grantees using EECBG grant funds. The survey is being administered to those individuals who have the most knowledge of the work done in buildings or communities that may result in energy savings. This information is needed in order for the evaluation team to calculate the impacts of the EECBG grant in terms of energy savings and other key metrics. Sufficiently detailed information is required for the use of industry-standard energy calculation formulae that have been incorporated into a tool specifically customized for this project.


  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


Employing data collected from existing EECBG databases and in-depth interviews with DOE project officers, grantees, and other primary stakeholders, the study team will answer the three key research questions of this evaluation:


  1. What is the total lifetime magnitude of energy and cost savings and other key outcomes achieved in those BPAs that cumulatively account for approximately 80% of total Formula Grant expenditures in the 2009-2011 program years?

  2. What is the lifetime magnitude of outcomes achieved by each of the most heavily funded BPAs within the EECBG portfolio?

  3. What are the key performance factors influencing the magnitude of EECBG outcomes?

These questions will be answered by evaluating a sample of 350 grants/activities from a pool of 2,338 direct grants and over 5,000 sub-grants. Of the fourteen BPAs, the following six account for approximately 80% of grant expenditures (an illustrative sample-allocation sketch follows the list):

  • Energy Efficiency Retrofits

  • Financial Incentives

  • Buildings and Facilities

  • Onsite Renewables

  • Lighting

  • Energy Efficiency and Conservation Strategy
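
By way of illustration only, a proportional allocation of the 350 sample points across these six strata might look like the following sketch. The expenditure shares used here are placeholder assumptions, not figures from the evaluation; the actual sample design is specified in Attachment A.

```python
# Illustrative only: a proportional allocation of the 350 sample points across
# the six most heavily funded BPAs. The expenditure shares below are placeholder
# assumptions; the actual sample design is specified in Attachment A.

TOTAL_SAMPLE = 350

# Hypothetical share of total Formula Grant expenditures by BPA (sums to ~0.80)
BPA_EXPENDITURE_SHARE = {
    "Energy Efficiency Retrofits": 0.25,
    "Financial Incentives": 0.15,
    "Buildings and Facilities": 0.15,
    "Onsite Renewables": 0.10,
    "Lighting": 0.08,
    "Energy Efficiency and Conservation Strategy": 0.07,
}

covered = sum(BPA_EXPENDITURE_SHARE.values())  # share of expenditures in scope
allocation = {
    bpa: round(TOTAL_SAMPLE * share / covered)
    for bpa, share in BPA_EXPENDITURE_SHARE.items()
}

# Rounding can leave the total off by one or two; a real design would adjust
# the largest stratum (or use a largest-remainder rule) to hit 350 exactly.
print(allocation, "total:", sum(allocation.values()))
```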



Across these six BPAs, the evaluation will assess the following metrics (an illustrative calculation sketch follows the list):

  • Energy savings – Energy savings will be determined based on engineering estimates applied to data reported in PAGE, augmented by survey data to verify and/or collect information regarding the number and types of measures implemented or other behaviors changed.

  • Reductions in energy costs – Energy cost reductions will be calculated by applying regional energy and demand costs to the project savings in the corresponding geographical regions.

  • Net job creation and productivity impacts – A combination of PAGE-reported data and survey responses updating the number of jobs created and lost after project completion will be used as inputs to an economic model (REMI) to calculate net job creation and productivity impacts.

  • Impact on air quality and carbon emissions – The results of the energy savings analysis will serve as inputs to calculations of air quality and carbon emissions impacts, taking into account the regional greenhouse gas emissions displaced by the reduction in required energy production.

  • Use of federal, state and local government resources, private sector investment and non-profit organizations’ services to increase program benefit – Questions in the survey will determine the extent to which alternative funding sources were leveraged as part of implementing the grant project.
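
To illustrate how the first several metrics chain together, the following minimal sketch shows the general shape of the calculation: survey-verified measure counts roll up to lifetime energy savings, which in turn drive cost and emissions estimates. All savings rates and regional factors here are placeholder assumptions, not values from the evaluation plan, PAGE, or the REMI model.

```python
# Illustrative sketch only: how survey-verified measure counts could roll up
# into the evaluation's headline metrics. All savings rates and regional
# factors below are placeholder assumptions, not values from the evaluation
# plan, PAGE, or the REMI model.

# Assumed annual savings per installed measure (kWh)
MEASURE_SAVINGS_KWH = {"lighting_retrofit": 120.0, "hvac_upgrade": 850.0}

# Placeholder regional energy-cost and grid-emissions factors
REGION_FACTORS = {
    "southeast": {"usd_per_kwh": 0.11, "co2_lb_per_kwh": 1.2},
    "northwest": {"usd_per_kwh": 0.09, "co2_lb_per_kwh": 0.4},
}

def activity_impacts(measure_counts, region, lifetime_years):
    """Roll measure counts up to lifetime energy, cost, and emissions impacts
    for a single grant Activity."""
    annual_kwh = sum(MEASURE_SAVINGS_KWH[m] * n for m, n in measure_counts.items())
    lifetime_kwh = annual_kwh * lifetime_years
    f = REGION_FACTORS[region]
    return {
        "lifetime_kwh": lifetime_kwh,
        "cost_savings_usd": lifetime_kwh * f["usd_per_kwh"],
        "avoided_co2_lb": lifetime_kwh * f["co2_lb_per_kwh"],
    }

# Example: a survey confirms 400 lighting retrofits in a Southeast jurisdiction
print(activity_impacts({"lighting_retrofit": 400}, "southeast", lifetime_years=12))
```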

The information that will be gathered and analyzed through this collection will have multiple audiences. It will be used to inform Congress, the Department, and the Administration of the current state of program performance. Statistics will be used to update and improve Program Assessment Rating Tool (PART) and Government Performance and Results Act (GPRA) assessments. Results of the study will also be used to identify strengths and weaknesses of program performance and to target ways in which performance can be improved at the federal, state, and local implementation levels.


For more information on the study design, please refer to Attachment A “EECBG_Detailed_Evaluation_Plan.”


  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.



Approximately 70% of data collection will involve the use of Computer-Assisted Telephone Interviewing (CATI) technology, and the remainder will rely on engineering desk reviews and conventional recordkeeping. CATI is a data collection tool that provides many advantages over standard paper surveys conducted over the phone. It creates a project database in real time, reducing the data entry errors that arise from transferring hard-copy survey data into electronic form, and enables skip patterns to be executed more efficiently as the survey is administered. CATI programming of the survey instrument can be coordinated with other electronic formats, such as web-based surveys, so that data can be easily merged and analyzed. In order to increase response rates, a recruiter will call each respondent to schedule a specific date and time to conduct the interview.


Approximately 30% of the data (not of the respondents) is expected to come from secondary sources (e.g., PAGE and other sources) collected by the research team via file reviews and internet research, whereas the remaining 70% will come from primary data collected via the surveys. Therefore, 100% of the sample of 350 grant recipients will have the survey administered via CATI.


  4. Describe efforts to identify duplication.


The EECBG survey is aimed at confirming and filling gaps in PAGE-reported data from the existing reporting protocols and, more importantly, obtaining additional detail on project characteristics required to carry out the evaluation’s energy calculations. The national EECBG evaluation is carefully designed to eliminate duplication of effort between evaluations implemented by the states and the national evaluation. The evaluation team is employing the following steps to ensure that duplication does not occur.


  1. Once the final sample is determined, the evaluation team will coordinate with regional DOE project officers to identify state evaluation efforts.

  2. Upon OMB approval of the EECBG ICR, the evaluation team will communicate with selected state program managers to identify programs and activities that are already being evaluated.

  3. Once the final sample is determined, the evaluation team will coordinate with evaluation contractors to learn of state efforts with which they are involved.

  4. The EECBG evaluation team will coordinate with the Better Buildings Neighborhood Program regarding any data collection addressing EECBG activities to ensure, to the extent possible, that a single respondent does not receive more than one data request.

The above efforts will keep the national EECBG evaluation informed of what states are doing so that the programs included in the national EECBG evaluation do not overlap with the state studies. Any activity that is already being evaluated will be replaced with a substitute activity from the oversample.

Steps are also being taken to avoid duplication with DOE evaluations of the State Energy Program (SEP) and low-income Weatherization Assistance Program (WAP). Those programs are separate and distinct from EECBG but fund some similar energy efficiency and renewable energy activities. In no case will the same programmatic activity be selected for study in more than a single evaluation. Additionally, the EECBG and SEP teams have taken steps to identify potential areas of overlap in the contact people to be surveyed for the two evaluations. In most cases, it appears that different individuals handle EECBG, SEP and WAP within the State energy organizations. However, there could be a few cases where the same individual has to be interviewed for EECBG and SEP evaluations. It is also possible, but even less likely, that there could be some overlap of respondents between the EECBG and WAP evaluations. It is important to note that the specific information being sought differs among the three studies, as does the timing of the data collection efforts. In sum, the national EECBG evaluation employs coordination, sampling and study approach designs that ensure non-duplicative evaluation efforts.


  5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The national EECBG evaluation will seek to minimize burden on small businesses and other small entities. To accomplish this, the evaluation has kept the sample size as low as possible; will collect information from the DOE Project Officers to the extent feasible in order to reduce burden on non-federal organizations; and will limit the information sought from small entities to the minimum necessary. When designing surveys, the project team diligently streamlined data collection instruments, implemented skip patterns, and maximized user-friendliness in order to minimize demands on respondents, including small entities.



  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


In the absence of a national evaluation of EECBG, DOE would not have the quantitative information needed to accurately document key outcomes by program area and determine program effectiveness. Also, in the absence of this evaluation, policy makers and program managers would lack essential information needed to make informed program design and resource allocation decisions should the program continue in future years. Similarly, individual jurisdictions would lack the information needed to select those energy efficiency and renewable energy programmatic activities that best meet their specific needs. However, since there are no current plans to continue the EECBG Program, this is a one-time evaluation intended to capture impacts and lessons learned regarding local government energy efficiency projects, and it is not anticipated to be repeated.



  7. Explain any special circumstances that would require the collection to be conducted in a manner inconsistent with OMB guidelines: (a) requiring respondents to report information to the agency more often than quarterly; (b) requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it; (c) requiring respondents to submit more than an original and two copies of any document; (d) requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years; (e) in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study; (f) requiring the use of a statistical data classification that has not been reviewed and approved by OMB; (g) that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; (h) requiring respondents to submit proprietary trade secrets, or other confidential information, unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.

There are none. The package is consistent with OMB guidelines.


  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken in response to the comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside DOE to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


The 60 Day Notice was published in the Federal Register, Vol. 77, No. 31, pp. 8852-8853, on February 15, 2012. No comments were received in response. The 30 Day Notice was published in the Federal Register, Vol. 77, No. 131, pp. 40345-40347, on July 9, 2012. One anonymous comment was received. The commenter opposed this information collection request based on the incorrect assumption that regular reporting for EECBG grantees has not been required by DOE and that this information collection request would not be necessary if regular reporting had been required. In fact, EECBG grantees are required by DOE to provide quarterly reports on their expenditures and activities. Information from those reports is being used in the national evaluation of EECBG. However, additional information is required to conduct a complete evaluation. That additional information is the subject of this information collection request. No response was given to the comment.


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Not applicable as no payment or gift is being proposed for any of the information collections covered in this request.


  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.


The general terms and conditions governing the Oak Ridge National Laboratory’s subcontract with the evaluation contractor incorporate by reference FAR 52.224-1 Privacy Act Notification (Apr 1984) and FAR 52.224-2 Privacy Act (Apr 1984).


The information provided by respondents to the surveys and data requests will be reported only in the aggregate, and a subject’s name, agency, or other identifying information will not be reported in association with individual answers. That information will likewise not be delivered by the evaluation contractor to Oak Ridge National Laboratory or DOE. Furthermore, any information presented to the public will be in the aggregate to prevent disclosure of personally identifiable information.


Names, addresses, and phone numbers of service recipients will be gathered from Program records and stored as part of this study. That information is not defined as protected Personally Identifiable Information (PII) because it is available from public sources. Nonetheless, those data and all other information collected during the course of this evaluation will be subject to the protocols the evaluation contractor uses for the protection of confidential information. To the extent feasible, those protocols are consistent with guidelines from the ISO 27001 standard and include restricted file access and the use of encryption software for portable devices containing confidential information (although any placement of study data on such devices would be limited, temporary, and task-specific). Any breach would be the responsibility of the evaluation contractor in accordance with its subcontract with ORNL and would be addressed as specified in its Incident Response Policy.


A Privacy Impact Assessment has been submitted to the Privacy Act Officer at DOE’s Oak Ridge Operations Office (ORO) explaining the nature of the information to be gathered and stored. Should any further action be required by the Privacy Act Officer, it will be taken as soon as those instructions are received.


The introduction to each data collection instrument contains Privacy Act language (see below) informing prospective respondents of the statutory authority for the collection, the purpose for which the information will be used, the voluntary nature of their participation, and the lack of adverse effects should they choose not to provide any or all of the requested information. The introduction further explains that the sole use of the information collected will be for an analysis of national-level Program impacts.


Privacy Act Language for Each Data Collection Instrument


The U.S. Department of Energy (DOE) would like to inform each individual that the information requested here is being solicited under the statutory authority of Title V, Subtitle E of the Energy Independence and Security Act of 2007, which authorizes DOE to establish and administer the Energy Efficiency and Conservation Block Grant (EECBG) Program. This information is being sought as part of a national evaluation of EECBG, the purpose of which is to reliably quantify Program accomplishments and help inform decisions on future operations. The sole use of the information collected will be for an analysis of national-level Program impacts. Disclosure of this information is voluntary and there will be no adverse effects associated with not providing all or any part of the requested information.


  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why DOE considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


No information of a sensitive nature is being collected.


  12. Provide estimates of the hour burden of the collection of information. The statement should indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, DOE should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable.


The Grant Activity Manager Survey will be administered one time to 350 respondents via Computer-Assisted Telephone Interviewing (CATI). While the instrument is designed to accommodate information collection for a broad range of programs and initiatives that span 350 sample points, individual respondents will only be asked the subset of questions that directly apply to their specific sampled Activity. The survey is designed in a modular fashion and will incorporate programmed skip logic to accommodate this strategy.


Time testing of the survey instrument was done by a group of six analysts on the evaluation team assuming a CATI delivery method. The analysts were paired, with one acting as the Grant Manager and the other administering the survey face-to-face. The analysts acting as Grant Managers were each given a set of 3-5 actual grant projects to study using the PAGE data and any other descriptive information that could be found on a selected project. The other member of each pair administered the survey as if it were a phone survey. It is highly unlikely that any one respondent would be required to answer every question in the survey instrument. A typical grant project involves either a residential or non-residential building (resulting in skipping almost half of the survey) and a subset of measures (e.g., a lighting retrofit, resulting in skipping several other sets of questions). Some grants are more complex and involve multiple buildings and measures, whereas others are less so. The average estimated burden for survey respondents assumes that the typical respondent will answer approximately 20% to 30% of the questions.
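
A minimal sketch of the kind of modular skip logic described above appears below. The module names and routing rules are hypothetical illustrations, not the actual structure of the instrument; the sketch simply shows how a respondent with a single-measure project would skip most of the survey.

```python
# Hypothetical sketch of modular skip logic: a respondent is routed only to
# the question modules relevant to their sampled Activity. Module names and
# routing rules are illustrative, not the actual instrument.

ALL_MODULES = [
    "introduction", "grant_confirmation", "residential_buildings",
    "nonresidential_buildings", "lighting_measures", "hvac_measures",
    "renewables_measures", "financing", "closing",
]

def modules_for(activity):
    """Return the subset of modules a given respondent actually answers."""
    selected = ["introduction", "grant_confirmation"]
    if activity["sector"] == "residential":
        selected.append("residential_buildings")     # skips non-residential block
    else:
        selected.append("nonresidential_buildings")  # skips residential block
    for measure in activity["measures"]:             # e.g. "lighting", "hvac"
        selected.append(f"{measure}_measures")
    if activity.get("used_financial_incentives"):
        selected.append("financing")
    selected.append("closing")
    return selected

# A typical single-measure retrofit skips most of the instrument:
typical = {"sector": "nonresidential", "measures": ["lighting"]}
subset = modules_for(typical)
print(subset, f"-> {len(subset)} of {len(ALL_MODULES)} modules")
```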


The survey will be administered in two parts, combining CATI and web-based survey modes. The first part, containing introductory and confirmatory questions, will be administered via a telephone interview estimated to average 20 minutes. The respondent will then be directed to a website where the remainder of the survey, involving more complex questions about the actual measures installed, can be completed online over time. Once the respondent has completed the web-based survey, he or she will submit it electronically. As shown in Table 1, the average burden per respondent is estimated to be 200 minutes, and the total burden for all 350 respondents is 1,167 hours. This includes all time required for respondents to read instructions, review their records, gather the necessary information, complete the survey, review their responses, and provide any necessary clarification of their answers.


Table 1: Burden Calculation

Grant Activity Manager Survey

  Respondents (n)    Survey Length (mins)    Burden (hours)
  350                80                      467

General Recordkeeping Data Gathering

  Respondents (n)    Record Review (mins)    Burden (hours)
  350                120                     700

TOTAL BURDEN (hours): 1,167






  13. Provide an estimate for the total annual cost burden to respondents or record keepers resulting from the collection of information.


The estimated cost burden to respondents is $41,347. This was calculated by identifying a labor category and hourly rate for the survey respondents from the Bureau of Labor Statistics, May 2011 National Occupational Employment and Wage Estimates. The labor category that most appropriately represents the respondent pool is 11-9151 “Social and Community Service Managers.” Multiplying this hourly rate ($35.43) by the number of burden hours (1,167) yields the estimated cost burden of $41,347.
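
For readers who want to trace the arithmetic, the following minimal sketch reproduces the burden hours from Table 1 and the cost figure above (note that the total hours are rounded before being multiplied by the hourly rate):

```python
# Minimal arithmetic check reproducing the burden and cost figures above.
RESPONDENTS = 350
SURVEY_MINUTES = 80           # 20-minute CATI call plus the web-based remainder
RECORD_REVIEW_MINUTES = 120   # general recordkeeping / data gathering
HOURLY_RATE = 35.43           # BLS May 2011, occupation 11-9151

survey_hours = RESPONDENTS * SURVEY_MINUTES / 60         # 466.7 -> 467 hours
record_hours = RESPONDENTS * RECORD_REVIEW_MINUTES / 60  # 700 hours
total_hours = round(survey_hours + record_hours)         # 1,167 hours

cost_burden = total_hours * HOURLY_RATE                  # $41,346.81
print(f"{total_hours:,} hours, ${cost_burden:,.0f}")     # 1,167 hours, $41,347
```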



  14. Provide estimates of annualized cost to the Federal government.


The total cost of this evaluation is $3.0 million over a two-year period. This includes $2.2 million for an independent evaluation subcontractor to develop a detailed evaluation plan and to collect and analyze the necessary data. The remainder of the $3.0 million covers the cost of developing an initial scope of work, issuing a competitive solicitation to select the independent evaluation team, conducting peer reviews, managing the study, and reviewing the products of this evaluation effort. The average annual cost is $1.5 million.


  15. Explain the reasons for any program changes or adjustments reported in Items 13 (or 14) of OMB Form 83-I.


This is a new information collection.


  16. For collections whose results will be published, outline the plans for tabulation and publication.


During the EECBG project, the evaluation team will prepare memos summarizing interim findings. Specifically, summary memos will be prepared upon completion of the following key tasks:

  • Sample Design and Selection

  • Energy Savings Analysis

Following the receipt of comments on those interim documents, they will be revised as needed.

Upon completion of the study, a draft report will be reviewed by an independent panel of evaluation experts and key stakeholders. Comments and direction received during the review process will be incorporated into the final written report. The specific method of publication of the final report has not yet been determined. However, any publication or other dissemination of information (e.g., website, newsletter, or presentation at a meeting or conference) will report results in the aggregate to prevent disclosure of PII.


  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.


Approval not to display the OMB expiration date is not being sought.


  18. Explain each exception to the certification statement identified in Item 19 of OMB Form 83-I.


No exceptions to the certification statement are being sought. The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.

1 As stated in the April 2011 EECBG Evaluation Plan original solicitation documents.
