
Supporting Statement

State Energy Program Evaluation

OMB Control Number: XXXX-XXXX


A. Justification


  1. Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the information collection.


The State Energy Program (SEP) is a national program operated by the U.S. Department of Energy (DOE). DOE’s Office of Weatherization and Intergovernmental Programs (OWIP), which manages SEP, has commissioned an evaluation to develop an independent estimate of key program outcomes.


SEP provides grants and technical support to the states and U.S. territories, enabling them to carry out cost-shared energy efficiency and renewable energy activities that meet each state’s unique energy needs while also addressing national goals, such as energy security. SEP was created by Congress in 1996 by consolidating the State Energy Conservation Program and the Institutional Conservation Program, which were both established in 1975.


In 2009, SEP received $3.1 billion of the American Recovery and Reinvestment Act (ARRA) funds that were allocated to DOE. The ARRA allocation represents a substantial increase in funding from the 2007 and 2008 Program Years, when funding levels were $69 million and $62 million, respectively.


Once the ARRA funding has been expended (by April 2012), the SEP allocation is expected to return to levels typical of the pre-ARRA period. OWIP plans to assess the outcomes of programmatic activity for one pre-ARRA year (Program Year 2008), as well as for the ARRA period (Program Years 2009 – 2011). Because of the difference in the level of funding and activities, the two evaluation periods will be treated as separate programs for purposes of sampling state-level activities and estimating national impacts.


A full-scale evaluation of SEP has not been conducted previously. This evaluation is needed to ensure accountability, by reliably quantifying what has been accomplished with the federal investment in SEP. In addition, the evaluation is needed to provide guidance to DOE and the State Energy Offices on how to refine the program in future years and to identify the highest impact activities for States to engage in.


Statutory Authority – Title III of the Energy Policy and Conservation Act of 1975 (42 U.S.C. 6321 et seq.), as amended, authorizes DOE to administer the State Energy Program (SEP).


  2. Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.


The information collected will be used by DOE to reliably quantify what SEP has accomplished nationwide, in terms of the key outcome measures identified for the program:

  • Reduction in energy use – how much energy, and how much in energy costs, does the program save by fuel type?1

  • Production of energy from renewable sources – how much has the program increased renewable energy capacity and generation?

  • Reduction in carbon emissions – how much has the program reduced carbon emissions?

  • Generation of jobs, including direct and indirect jobs attributable to SEP2

The overall objective of this evaluation is to develop independent quantitative estimates of key program outcomes at the following levels of aggregation:

  • The most heavily funded SEP broad program area categories (BPACs) for PY 2008

  • The most heavily funded BPACs for the ARRA period

  • Sum of all studied BPACs for PY 2008

  • Sum of all studied BPACs for the ARRA period

The information that will be gathered and analyzed through this collection will have multiple audiences. It will be used to inform Congress, the Department, and the Administration of the current state of program performance. Statistics will be used to update and improve Program Assessment Rating Tool (PART) and Government Performance and Results Act (GPRA) assessments. Results of the study will also be used to identify strengths and weaknesses of program performance, in order to target ways in which performance can be improved at the federal, state, and local implementation levels.


For more information on the study design, please refer to Attachment A “SEP_Detailed_Evaluation_Plan.”



  3. Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.



Approximately 50% of data collection will involve the use of computer-assisted telephone interviewing (CATI) technology, and the remainder will rely on in-depth interviews, on-site protocols, and conventional recordkeeping. A list of surveys by category can be found in Attachment B “SEP_Evaluation_Summary_of_Surveys.”


For a subset of CATI surveys, the evaluation team will use Voxco, a multi-mode survey software package that allows us to easily conduct and integrate CATI and Computer Assisted Web Interviews (CAWI) on the same project. The software allows us to contact the same respondent by email (when an address is available), by phone, and again by email if necessary. All of these contact attempts are tracked and are used to calculate the final response rate.
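
To illustrate how tracked contact attempts feed into a response-rate calculation, a minimal sketch follows. This is not Voxco’s implementation; the disposition categories and the simple completes-over-eligible-cases rate definition below are illustrative assumptions only.

```python
from collections import Counter

# Hypothetical final case dispositions recorded across all email and phone
# contact attempts (illustrative categories, not Voxco's actual schema).
dispositions = ["complete", "refusal", "complete", "no_contact",
                "ineligible", "complete", "refusal", "no_contact"]

counts = Counter(dispositions)

# Simple response-rate definition: completed interviews divided by all
# eligible cases (every case except those screened out as ineligible).
eligible = sum(n for d, n in counts.items() if d != "ineligible")
response_rate = counts["complete"] / eligible

print(f"{counts['complete']} completes / {eligible} eligible = {response_rate:.1%}")
```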


For quantitative surveys, we assume that email addresses will not be available for our planned research populations. Consequently, there will be no broadcast emails to research populations, and respondents will be informed of the study through the introductory language and screener scripts of each survey. If we discover after contacting respondents by phone that email addresses are available, those addresses will be used for any follow-up questions or to complete surveys not finished during the first contact. If we discover that email addresses are available before the survey begins, we will program the CATI instrument into a CAWI format as another option, to increase response rates among respondents who prefer to participate by internet or who would not otherwise participate by telephone. This web-based approach will also reduce respondent burden and increase the accuracy of data collection by eliminating the telephone interviewer for those respondents who choose the web-based option.


  4. Describe efforts to identify duplication.


The national SEP evaluation is specifically designed to avoid duplicating evaluations implemented by the states. The evaluation employs the following steps to ensure that duplication of evaluation efforts does not occur.


  1. The national SEP evaluation coordinates with the National Association of State Energy Officials (NASEO) to share information on state studies that may target the same broad program areas as the national study, so that the national study does not select the same programmatic activities for examination. DOE holds “office hours” twice yearly at NASEO meetings.

  2. The national SEP evaluation is coordinating with regional DOE project officers to identify any state evaluation efforts with which they are associated and eliminate sample overlap.

  3. The national SEP evaluation team regularly meets with selected state program managers (the Network Steering Committee) to understand what evaluation efforts the states are undertaking and the research approaches being employed, in order to identify programs that are already being evaluated.

  4. The national SEP evaluation is coordinating with evaluation contractors to learn of state evaluation efforts with which they are involved. For example, the study has already coordinated with the NY, CO, and CA evaluation contractors conducting studies for those states so that duplication of evaluation efforts does not occur.

The above efforts will keep the national SEP evaluation informed of what states are doing so that the programs sampled for the national SEP evaluation do not overlap with the state studies. These coordination efforts are ongoing and will continue for the remainder of this evaluation effort.


In addition to these efforts to avoid duplication, DOE has provided a set of evaluation guidelines to the states as part of its SEP ARRA-period reporting requirements. The purpose of those guidelines is to help inform the states’ evaluation efforts and ensure that the results of any independent state evaluations are reliable enough to allow them to be used to support the national SEP evaluation without the need to study the same activities again. Those guidelines are included as Attachment C: “SEP Evaluation Guidelines (Recovery Act).”

In summary, the national SEP evaluation employs coordination, sampling, and study design approaches that ensure non-duplicative evaluation efforts.


  5. If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.


The national SEP evaluation will seek to minimize burden on small businesses and other small entities. To accomplish this, the evaluation has kept the sample size as low as possible; will collect information from the DOE Project Officers to the extent feasible in order to reduce burden on non-federal organizations; and will limit the information sought from small entities to the minimum necessary. When designing surveys, the project team sought to streamline data collection instruments so as to maximize user-friendliness and minimize demands on respondents, including small entities. Finally, when conducting residential interviews and site visits, evaluators will do their utmost to minimize time, resource, and information demands.


  6. Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.


Without the findings to be generated by the SEP National Evaluation, DOE policy makers and program managers would not have the quantitative information needed to accurately document key outcomes by program area and make informed program design and resource allocation decisions for future years. Similarly, the individual states and territories would lack the information needed to select those energy efficiency and renewable energy programmatic activities that best meet their state-specific needs.


  7. Explain any special circumstances that require the collection to be conducted in a manner inconsistent with OMB guidelines: (a) requiring respondents to report information to the agency more often than quarterly; (b) requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it; (c) requiring respondents to submit more than an original and two copies of any document; (d) requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years; (e) in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study; (f) requiring the use of statistical data classification that has not been reviewed and approved by OMB; (g) that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; (h) requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.

There are none. The package is consistent with OMB guidelines.



  8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken in response to the comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside DOE to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.


The 60 Day Notice was published in the Federal Register, Vol. 76, No. 130, pp. 39860 – 39862, on July 7, 2011. No comments were received from the public. Both the 60 Day and 30 Day Federal Register notices are attached to this submission as Attachment D “SEP_FRN_60,” and Attachment E “SEP_FRN_30.”


  9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.


Not applicable, as no payment or gift is being proposed for any of the information collections covered in this request.


  10. Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.



The information provided by respondents to the surveys and data requests will be reported only in the aggregate; a subject’s name, agency, or other identifying information will not be reported in association with individual answers. That identifying information will likewise not be delivered by the evaluation contractor to Oak Ridge National Laboratory (ORNL) or DOE.


Names, addresses, and phone numbers of service recipients will be gathered from Program records and stored as part of this study. That information is not defined as protected Personally Identifiable Information (PII) because it is available from public sources. Nonetheless, those data and all other information collected during the course of this evaluation will be subject to the protocols the evaluation contractor uses for the protection of confidential information. To the extent feasible, those protocols are consistent with guidelines from the ISO 27001 code of practice and include restricted file access and the use of encryption software for portable devices containing confidential information (although any placement of study data on such devices would be limited, temporary, and task-specific). Any breach would be the responsibility of the evaluation contractor in accordance with its subcontract with ORNL and would be addressed as specified in its Incident Response Policy.


A Privacy Impact Assessment will be submitted to the Privacy Act Officer at DOE’s Oak Ridge Operations Office (ORO) explaining the nature of the information to be gathered and stored. Should any further action be required by the Privacy Act Officer, it will be taken as soon as those instructions are received.


The introduction to each data collection instrument contains Privacy Act language (see below) informing prospective respondents of the statutory authority for the collection, the purpose for which the information will be used, the voluntary nature of their participation, and the lack of adverse effects should they choose not to provide any or all of the requested information. The introduction further explains that the sole use of the information collected will be for an analysis of national-level Program impacts.


Privacy Act Language for Each Data Collection Instrument


The U.S. Department of Energy (DOE) would like to inform each individual that the information requested here is being solicited under the statutory authority of Title III of the Energy Policy and Conservation Act of 1975, as amended, which authorizes DOE to administer the State Energy Program (SEP). This information is being sought as part of a national evaluation of SEP, the purpose of which is to reliably quantify Program accomplishments and help inform decisions on future operations. The sole use of the information collected will be for an analysis of national-level Program impacts. Disclosure of this information is voluntary and there will be no adverse effects associated with not providing all or any part of the requested information.




  11. Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why DOE considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.


No information of a sensitive nature is being collected.


  12. Provide estimates of the hour burden of the collection of information. The statement should indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, DOE should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable.


An attachment shows the burden estimate for each element of the data collection effort, as well as the total burden. Seventeen in-depth interview guides, seven CATI surveys, and two on-site protocols will be administered to representatives of various populations (project managers, participants, vendors, etc.), for a total of 26 separate data collection instruments. All 26 instruments are attached to this submission for OMB review, as is a summary table identifying key features of each instrument, including survey name, sampling frame, sample size, survey objective, and utility of the data collected (Attachment B “SEP_Evaluation_Summary_of_Surveys”). An additional summary table that identifies the BPAC/Subcategories addressed by each survey, and the respective study period(s) covered, is provided as Attachment F “_SEP_Survey_BPAC_Period_Table.”


Data will be collected from a total of 5,635 respondents, 150 of whom will be subject to two separate information collections. One hundred commercial participants (respondents to the Commercial Participants Survey) and 50 residential participants (respondents to the Residential Participants Survey) will be randomly selected as a nested sample within each of those respondent populations. This information collection request does not entail any household surveys that would take place during the Census embargo on such activities.


The data collection effort will span less than one year. The revised burden estimate is 7,735 hours, composed of 6,663 data collection hours and 1,072 recordkeeping hours. The revision is based on assuming the maximum pre-tested duration for all of the survey instruments. Hour burden estimates were prepared separately for each individual data collection effort based on the nature of the information requested, the instrument used, experience with similar data collection efforts, and feedback provided by survey pre-testers.


The estimated hour burden of the information collection is as follows:

  • Total number of unduplicated respondents: 5,635 (PY2008 – 3,662; ARRA – 1,973)

  • Reports filed per person: 1.03 (PY2008 – 1.03; ARRA – 1.03)

  • Total annual responses: 5,785 (PY2008 – 3,760; ARRA – 2,025)

  • Total annual burden hours: 7,735 (PY2008 – 5,044; ARRA – 2,691)

  • Average burden per collection: 1.34 hours (PY2008 – 1.34; ARRA – 1.33)

  • Average burden per respondent: 1.37 hours (PY2008 – 1.38; ARRA – 1.36)
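
The derived figures above follow directly from the three combined totals (unduplicated respondents, annual responses, and annual burden hours). A minimal sketch of that arithmetic, using the totals stated in this section:

```python
# Combined totals stated in this section.
respondents = 5_635    # unduplicated respondents
responses = 5_785      # total annual responses (150 respondents answer twice)
burden_hours = 7_735   # total annual burden hours

# Derived statistics reported above (rounded to two decimal places).
reports_per_person = responses / respondents            # ~1.03
avg_burden_per_collection = burden_hours / responses    # ~1.34 hours
avg_burden_per_respondent = burden_hours / respondents  # ~1.37 hours

print(f"Reports filed per person: {reports_per_person:.2f}")
print(f"Average burden per collection: {avg_burden_per_collection:.2f} hours")
print(f"Average burden per respondent: {avg_burden_per_respondent:.2f} hours")
```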


The estimated cost burden to respondents is $157,768. This was calculated by identifying an appropriate labor category and hourly rate for the respondents to each survey and multiplying that rate by the number of burden hours. Labor categories and their associated wage rates were obtained from the Bureau of Labor Statistics, May 2011 National Occupational Employment and Wage Estimates. Attachment G, “SEP_Eval_Burden_Cost_Estimate,” shows how the costs for each instrument and the total costs for all instruments combined were calculated from burden hours and hourly costs.
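
The per-instrument cost calculation can be sketched as follows. The burden hours and wage rates shown are hypothetical placeholders, as is the third instrument name; the actual labor categories and May 2011 BLS rates appear in Attachment G.

```python
# Hypothetical inputs per instrument: (burden hours, BLS hourly wage rate).
# The actual labor categories and May 2011 BLS rates are in Attachment G.
instruments = {
    "Commercial Participants Survey": (250.0, 30.00),
    "Residential Participants Survey": (400.0, 22.00),
    "Program Manager Interview Guide": (120.0, 45.00),  # hypothetical name
}

# Cost per instrument = burden hours x hourly wage rate of its respondents.
costs = {name: hours * rate for name, (hours, rate) in instruments.items()}

for name, cost in costs.items():
    print(f"{name}: ${cost:,.2f}")
print(f"Total respondent cost burden: ${sum(costs.values()):,.2f}")
```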


  13. Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting from the collection of information.


Data collection for this evaluation will entail database review activities. The recordkeeping burden estimate is 1,072 hours, based on approximately 16 hours of labor for each of 65 different programmatic activities (PAs).



  14. Provide estimates of annualized cost to the Federal government.


The total cost of this evaluation is $12 million over a three-year period. This includes $9.7 million for an independent evaluation subcontractor to develop a detailed evaluation plan and to collect and analyze the necessary data. The remainder of the $12 million covers the cost of developing an initial scope of work, issuing a competitive solicitation to select the independent evaluation team, conducting peer reviews, managing the study, and reviewing the products of this evaluation effort. The average annual cost is $4 million.



  15. Explain the reasons for any program changes or adjustments reported in Items 13 (or 14) of OMB Form 83-I.


This is a new information collection; therefore, there are no program changes or adjustments.


  16. For collections whose results will be published, outline the plans for tabulation and publication.


A comprehensive evaluation report to ORNL and DOE will be completed by December 2012. The timeline allows for a draft final evaluation report to be completed by the end of October 2012, with a reviewed report to follow by November 16, 2012. The final evaluation report will document procedures, analysis, results, and interpretation of findings for the key outcome measures identified for the program. The report will receive peer review by the SEP advisory panel.


To provide ORNL and DOE with more timely feedback on the early results of the evaluation effort, the following interim reports will be prepared:


  • BPAC Reports – This interim series of reports will summarize evaluation results and findings for Renewables; Loans, Grants, and Incentives; Clean Energy Policy Support; Technical Assistance; Building Retrofits; and Codes and Standards (October 2012).

  • Labor Impact Analysis – This will report results and findings on the direct and indirect job creation effects of SEP (October 2012). DOE has researched this issue internally with its staff and with the National Renewable Energy Laboratory (NREL). There is no duplication of effort in this evaluation with other existing initiatives to estimate labor impacts or to develop job creation models.

  • Carbon Analysis – This will report findings on the carbon emissions impacts and mitigation benefits of SEP (October 2012). The benefits of carbon emission reductions will be calculated in a manner that is consistent with the approach described in the Technical Support Document on the Social Cost of Carbon for Regulatory Impact Analysis prepared by the Interagency Working Group on Social Cost of Carbon in February 2010 (http://www.whitehouse.gov/sites/default/files/omb/inforeg/for-agencies/Social-Cost-of-Carbon-for-RIA.pdf); a simplified sketch of this monetization step appears after this list. A full description of the analyses to be performed for all reports is found in Attachment A: “SEP_Detailed_Evaluation_Plan.”
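
A minimal sketch of the carbon-benefit monetization step follows. The annual emissions reductions and the single flat dollar value per ton are placeholders; the actual analysis applies the year-specific social cost of carbon schedule (at multiple discount rates) from the February 2010 interagency Technical Support Document.

```python
# Illustrative annual CO2 emission reductions attributed to SEP (metric tons).
annual_reductions_tons = {2009: 120_000, 2010: 250_000, 2011: 310_000}

# Placeholder dollar value per metric ton of CO2. The actual analysis uses
# the year-specific interagency SCC schedule (February 2010 TSD) evaluated
# at several discount rates, not a single flat value.
scc_per_ton = 21.00

benefits = {year: tons * scc_per_ton
            for year, tons in annual_reductions_tons.items()}

for year, dollars in sorted(benefits.items()):
    print(f"{year}: ${dollars:,.0f} in monetized carbon benefits")
```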


  17. If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons why display would be inappropriate.


Approval not to display the OMB expiration date is not being sought.


  18. Explain each exception to the certification statement identified in Item 19 of OMB Form 83-I.


No exceptions to the certification statement are being sought. The agency is able to certify compliance with all provisions under Item 19 of OMB Form 83-I.

1 Energy cost savings will be calculated from energy savings using data on average retail price by sector and state from the U.S. Energy Information Administration (EIA).
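
A minimal sketch of this footnote’s calculation, using hypothetical savings and prices (the actual analysis uses EIA average retail prices by sector and state):

```python
# Hypothetical energy savings by (state, sector, fuel) and average retail
# prices in the style of EIA data; real values come from EIA by sector/state.
savings_kwh = {("CO", "residential", "electricity"): 1_200_000,
               ("CO", "commercial", "electricity"): 800_000}
price_per_kwh = {("CO", "residential", "electricity"): 0.11,
                 ("CO", "commercial", "electricity"): 0.09}

# Energy cost savings = energy savings x average retail price, summed over
# every (state, sector, fuel) combination.
cost_savings = sum(kwh * price_per_kwh[key] for key, kwh in savings_kwh.items())
print(f"Estimated energy cost savings: ${cost_savings:,.2f}")
```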

2 The proposed approach includes a 51-region (state) REMI Policy Insight simulation model. This analysis system has been successfully applied to numerous energy and environmental policy/program evaluations. This model was chosen over others because it has the relevant economic levers and feedbacks to handle the types of effects expected to flow from such project spending and from the adoption of energy-saving (or energy-generating) technologies. The model is a computable general equilibrium (CGE) simulation forecasting system of industry-level activity for 70 different industries (approximating three-digit NAICS definitions of business activity) through the year 2050. It is well specified through its internal logic and equation set, such that feedbacks among economic stakeholders (households and businesses) are captured when more energy-efficiency and renewable generation investments take place. A multi-state model (of 51 regional economies) will exhibit feedback among states (inter-regional) for labor flows (commuters) and for trade in manufactured goods and services. Unique to the REMI model, among the class of competing regional economic impact frameworks, is the linkage to market shares. Policies or investments that change the underlying cost of doing business for an industry in a given region will affect that industry’s relative competitiveness (relative to the U.S. average for that industry) and its ability to retain or gain sales within its own region, elsewhere in the multi-region marketplace, elsewhere in the U.S., and in non-U.S. trade.
