Evaluation Work Plan: FINAL
Energy Efficiency and Conservation Block Grant
February 9, 2012
Table of Contents

1. Executive Summary
   1.1 Program Description
   1.2 Evaluation Objectives
   1.3 Evaluation Approach
   1.4 Project Timeline
2. Introduction
   2.1 Program Description
   2.2 Evaluation Objectives
   2.3 Evaluation Approach
   2.4 Work Plan Structure
3. Data Collection
   3.1 Reporting Data and Activity Documentation
   3.2 Survey Instruments
      3.2.1 Objectives
      3.2.2 Survey #1 – DOE Program Officers, Regional and State Coordinators
      3.2.3 Survey #2 – Local Grant Activity Manager Survey (GAMS)
      3.2.4 Survey #3 – Performance Indicators Survey
   3.3 Follow-Up Interviews with Grantee and Sub-Grantee Activity Project Managers
4. Program Data Characteristics and Sample Design
   4.1 Sampling Frame
      4.1.1 Structure and Sampling Units
      4.1.2 Frame Restrictions
   4.2 Sampling in Waves
   4.3 Stratification
   4.4 Allocation
   4.5 Oversampling
5. Review of EECBG Databases
   5.1 Creation of Review Packages
   5.2 Review Procedure
   5.3 Post-Survey Review
6. Survey Implementation
   6.1 Population of Surveys with Data from Review of EECBG Databases
   6.2 Survey Training
   6.3 Conduct Survey #1 – DOE Program Officers Interviews
   6.4 Conduct Survey #2 – Local Grantee Grant Activity Manager Survey
   6.5 Conduct Survey #3 – Performance Indicators Survey
   6.6 Follow-Up Calls
   6.7 Prepare Survey Results Database
7. Estimation of Energy Savings
   7.1 Introduction
   7.2 Groupings of Programs for Energy Impact Assessment Planning
   7.3 Evaluation Plans: Building Retrofit and Equipment Replacement
      7.3.1 Introduction
      7.3.2 Development of the Savings Calculation Tool (SCT)
      7.3.3 Energy Impacts Assessment Approach
   7.4 On-Site Renewable Technology Program
      7.4.1 Introduction
      7.4.2 Assessment of Evaluability
      7.4.3 Verification Data Collection and Analysis
      7.4.4 Expansion of Sample Savings Estimates to the Population of Activities
   7.5 Energy Efficiency and Conservation Strategy
      7.5.1 Assessment of Evaluability
      7.5.2 Estimation of Energy Impacts
      7.5.3 Expansion of Sample Savings Estimates to the Population of Activities
8. Attribution Approach
   8.1 Source of Attribution Data
   8.2 Attribution Analysis
   8.3 Determination of EECBG Attribution
9. Conduct Carbon Emissions Reduction Analysis
   9.1 Assessment of Carbon Impacts
10. Conduct Employment Analysis
   10.1 Broad Parameters of Jobs Assessment
   10.2 Economic Impact Model for Identifying Job Impacts
   10.3 Translating EECBG Project Direct Effects into Economic Events
   10.4 Presentation of Job Impacts
11. Conduct Analysis of Organizational/Operational Factors Influencing Outcomes
   11.1 Incorporation of Other Data
   11.2 Regression Modeling
   11.3 Other Statistical Analysis Approaches
12. Reporting and Presentation
   12.1 Interim Reports
   12.2 Draft and Final Reports and Presentation
13. Project Management and Administration
   13.1 Description of Project Management Tools
   13.2 Schedule
   13.3 Budget Allocation and Expected Spend Rate
   13.4 Project Team and Responsibilities
      13.4.1 Project Organization Chart

Figures
Figure 1-1: EECBG Evaluation Approach
Figure 3-1: EECBG Data Collection Processing Flow
Figure 3-2: EECBG/SEP TTA Regional Coordinators
Figure 3-3: Structure, Purpose and Sequence of Survey Modules for Survey #1 Program Officer Survey
Figure 3-4: Structure, Purpose and Sequence of Survey Modules for Survey #2 Grant Activity Project Manager Survey
Figure 3-5: Structure, Purpose and Sequence of Survey Modules for Survey #3 Performance Indicators Survey
Figure 4-1: Certainty Allocations
Figure 4-2: Illustration of Systematic Proportional Sampling
Figure 6-1: Survey Instrument Summary
Figure 7-1: Representation of Energy Savings from Retrofit
Figure 10-1: REMI Economic Forecasting Model – Basic Structure and Linkages
Figure 10-2: Identifying Economic Impacts in the REMI Framework
Figure 10-3: REEM Framework for Energy Impact Analysis
Figure 13-1: Project Organization Chart

Tables
Table 1-1: BPAs Receiving Top 80% of Funding
Table 2-1: BPAs Receiving Top 80% of Funding
Table 3-1: List of Surveys and Their Objectives
Table 4-1: Structure of Grant and Sub-Grant Recipients and Activities for the 6 BPAs
Table 4-2: Budget and Activity Count by BPA per PAGE – 12/18/2011
Table 4-3: Initial Proportional-to-Size Allocation by BPA and Metric Activity
Table 11-1: Example Factors Influencing EECBG Outcomes
1. Executive Summary

1.1 Program Description
The EECBG Program, authorized in Title V, Subtitle E of the Energy Independence and Security
Act (EISA) and signed into law on December 19, 2007, was funded for the first time by the
American Recovery and Reinvestment Act of 2009 (ARRA). The Funding Opportunity
Announcement (FOA) for Formula Grants was issued on June 25, 2009 and closed on June 25,
2010. Over $2.7 billion was distributed through Formula Grants to about 2,350 cities, counties,
states, territories and Native American tribes. This funding represents a Department of Energy
priority to increase energy efficiency activities and renewable energy installations across the
country while decreasing overall energy use and associated greenhouse gas emissions,
increasing jobs and stimulating the economy.
The Program was designed to enable grant recipients to create and implement strategies to:
Reduce fossil fuel emissions
Reduce total energy use
Improve energy efficiency in the building and transportation sectors.
Recipients were encouraged and given the flexibility to develop new and innovative approaches
across these three focus areas that would yield long-term sustainable impacts. Grants could be
used in any of 14 eligible Activity areas referred to in this document (also known as Broad
Program Areas, or BPAs). All funds were required to be committed within 18 months of award
and fully expended within 36 months. The six BPAs shown in Table 1-1 account for 80% of all
EECBG funding; the evaluation will focus on these six areas exclusively.
Table 1-1: BPAs Receiving Top 80% of Funding [1]
Top 6 Broad Program Areas
Energy Efficiency Retrofits
Financial Incentives
Buildings and Facilities
Onsite Renewables
Lighting
Energy Efficiency and Conservation Strategy
1.2 Evaluation Objectives
The EECBG evaluation presents a complex challenge. Evaluators must understand the overall
objectives of the EECBG Program, the variations on the objectives present within each grant (and, in the case of State grants, their sub-grants), and the variety of unique projects (referred to
as “Activities”) carried out under a grant. Much of the funding is directed to projects resulting in
direct energy impacts. Other components are structured to achieve market development and
transformation goals, and still others provide a platform to increase overall awareness and aid in
state and local long-term planning efforts.
The evaluation of the EECBG Program is intended to “document the Program’s principal
achievements and provide valuable information for policy makers and program managers to
help inform future energy efficiency and renewable energy efforts." [2] This will require a
combination of qualitative and quantitative approaches designed to effectively communicate
both the direct energy impacts and the features that enabled success for grantees.
Employing data collected from existing EECBG databases and in-depth interviews with DOE
project officers, grantees, and other primary stakeholders, KEMA will answer the three key
research questions of this evaluation:
[1] Based on review of data as of December 18, 2011; this order may change once the full dataset of grants and sub-grants is reviewed.
[2] As stated in the April 2011 EECBG Evaluation Plan original solicitation documents.
1) What is the total lifetime magnitude of energy and cost savings and other key outcomes
achieved in those BPAs that cumulatively account for approximately 80% of total
Formula Grant expenditures in the 2009-2011 program years?
2) What is the lifetime magnitude of outcomes achieved by each of the most heavily-funded BPAs within the EECBG portfolio?
3) What are the key performance factors influencing the magnitude of EECBG
outcomes?
These questions will be answered based on evaluating a sample of 350 grants/activities from a
pool of 2,338 direct grants and over 5,000 sub-grants. The following six BPAs account for
approximately 80% of grant expenditures:
Energy Efficiency Retrofits
Financial Incentives
Buildings and Facilities
Onsite Renewables
Lighting
Energy Efficiency and Conservation Strategy
The evaluation will assess the following metrics:
Energy savings
Reduction in energy costs
Net job creation and productivity impacts
Impact on air quality and fossil fuel emissions
Use of federal, state and local government resources, private sector investment and
non-profit organizations' services to increase program benefits
1.3 Evaluation Approach
The EECBG Program will be evaluated between August 2011 and December 2012 (16 months).
The implementation of the grants/sub-grants included in the study began in 2009 and may
continue through mid-2012, with approximately 20% of the grants having been completed at the
time the RFP was issued in early May 2011. Grants at all stages of completion are eligible for
sampling.
The study will include two key analyses: 1) calculate outcomes attributable to EECBG funding,
and 2) identify performance factors. The latter investigation is more exploratory in nature and
will yield important information for understanding “whether and to what extent the organizational
and operational [and other] factors examined influenced the achievement of key outcomes”
(Evaluation Plan, RFP p.11). The following figure summarizes the various components of the
EECBG Evaluation Study.
Figure 1-1: EECBG Evaluation Approach
The key components in the evaluation include:
Characterize the full set of EECBG program activities in terms of BPAs and
measures of size. In terms of the evaluation, the principal objectives of this step are to:
Develop the sample frame from which the individual program activities to be
evaluated will be selected and analyzed.
Develop the information needed to expand the results from the sampled program
activities to estimate total impacts for the BPA groups
Gather information on the level and quality of available program documentation,
which will be used to make final determinations of evaluation approaches to be taken
in regard to specific BPAs.
Develop the sample of individual program activities for evaluation. The KEMA
team will select a sample of at least 350 individual program activities from the total pool
of grants and sub-grants listed in the DOE database.
Assess the “evaluability” of the sampled individual program activities. The
Evaluation Team will develop a set of criteria for determining whether a program Activity
that is selected into the sample has adequate information concerning energy savings
actions taken to render a reasonable estimate of outcomes. The steps include the
following:
a. Confirm progress in implementation.
b. Confirm quality and availability of program records.
c. Eliminate non-energy-producing Activities, such as grants or sub-grants used for
administrative support, through the survey.
Conduct engineering desk reviews to estimate energy impacts of the selected
Activities. Each Activity selected in the sample will be assigned to a project engineer
for conducting an engineering desk review of all available data associated with the grant
and Activity.
For each selected individual Activity, we will quantify the energy savings. The savings
estimates will be based upon data and information from the following sources:
Activity data and documentation including grant applications and quarterly
reports from the PAGE database
Other databases including data maintained by DOE’s Golden, Colorado office
Telephone surveys with DOE Program Officers
Telephone surveys with grantee or sub-grantee project managers
Follow-up telephone interviews with the grantee or sub-grantee project managers who are directly involved in and most knowledgeable about the Activity, to obtain information and data required for the analysis that are not available from the various DOE databases or surveys
Information from state websites regarding the EECBG programs and, as
available and of good quality, the results of state-level evaluations of EECBG
projects.
These data will be combined with documented input assumptions and applied to
standard engineering formulae to estimate savings for all or a sample of participants. [3]
Attribute estimated energy impacts to the individual program activities. For
each selected Activity, KEMA will carry out an analysis to assess the portion of
estimated energy impacts that were attributable to the EECBG program activities in
the sample and other influences such as general developments in the market or the
activities of other organizations offering similar kinds of programs or services.
Attribution of effects will be assessed separately for each individual programmatic
Activity studied and be based on information collected from grantee decision makers
and other sources.
Estimate energy cost savings. For each selected Activity, KEMA will calculate the value of annual energy savings and demand reductions at current energy costs over the effective useful life of the Activity.
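As a minimal sketch of this calculation (assuming flat current prices over the life of the measure, since the plan does not specify price escalation or discounting):

\[
S_{\mathrm{life}} = \mathrm{EUL} \times \left( \Delta kWh \times p_{\mathrm{elec}} + \Delta therms \times p_{\mathrm{gas}} + \Delta kW \times p_{\mathrm{demand}} \right)
\]

where \( \Delta kWh \), \( \Delta therms \), and \( \Delta kW \) are the verified annual energy savings and demand reduction for the Activity, the \( p \) terms are current unit prices, and EUL is the effective useful life in years.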
Estimate effects of individual activities on carbon emissions. We will use
estimates of annual and lifetime energy savings attributable to the program as inputs
to a model that estimates carbon emissions reductions based on the carbon content
of fossil fuels and electricity consumption avoided.
[3] These approaches are commonly referred to as engineering-based assessment or statistically-adjusted engineering assessment.
Estimate effects of individual activities on employment. The energy savings
estimates will be combined with other program information, such as matching funds
contributed, participant expenditures for labor and materials, and direct program
expenditure as inputs into a regional economic model to estimate net employment
impacts.
Once the individual Activity evaluations are completed and reviewed for accuracy and
completeness, the effort will shift to aggregation of sample results by BPA, projection to the
national level, and interpretation of findings. KEMA and its subcontractors will expand the
sample results to the top-funded BPAs using the relationship between verified metrics for the
sample activities and information on measures of size (funding).
1.4 Project Timeline

[Gantt chart spanning October 2011 through December 2012, showing the schedule for the following key tasks:]

- Development of Workplan
- Sample Design
- Wave 1 – Sample Selection
- Database Development
- Development of Survey Instruments
- ICR Review Process
- Assess the Feasibility to Perform Non-ICR Analysis
- Conduct Non-ICR Analysis (if necessary)
- Wave 1 – Desk Reviews
- Refresh Database and Add TAG Results
- Wave 2 – Sample Selection
- Wave 2 – Desk Reviews
- Savings Analysis
- Job Creation Analysis
- Emissions Analysis
- Analysis of Performance Factors
- Activity Aggregation
- Draft Report (November 3, 2012)
- Final Presentation (November 10, 2012)
- Final Report (December 22, 2012)
2. Introduction
This document presents the Work Plan for the EECBG Evaluation and will serve as a reference
over the next several months as the plan is implemented.
As the work proceeds, new information could require the evaluation approach to be adjusted. If so, rather than revise this document, we will issue an Addendum to the Plan explaining any ORNL-approved alterations to the approach outlined herein. Overall, the methods and approaches presented in this document should provide adequate flexibility to allow for minor adjustments without sacrificing the reliability of the evaluation findings. More significant project risks and mitigation strategies were considered by the team and sponsors early in the project and are outlined in Appendix A.
2.1 Program Description
The EECBG Program, authorized in Title V, Subtitle E of the Energy Independence and Security
Act (EISA) and signed into law on December 19, 2007, was funded for the first time by the
American Recovery and Reinvestment Act of 2009 (ARRA). The Funding Opportunity
Announcement (FOA) for Formula Grants was issued on June 25, 2009 and closed on June 25,
2010. Over $2.7 billion was distributed through Formula Grants to about 2,350 cities, counties,
states, territories and Native American tribes. This funding represents a Department of Energy
priority to increase energy efficiency activities and renewable energy installations across the
country while decreasing overall energy use and associated greenhouse gas emissions,
increasing jobs and stimulating the economy.
The Program was designed to enable grant recipients to create and implement strategies to:
Reduce fossil fuel emissions
Reduce total energy use
Improve energy efficiency in the building and transportation sectors.
Recipients were encouraged and given the flexibility to develop new and innovative approaches
across these three focus areas that would yield long-term sustainable impacts. Grants could be
used in any of 14 eligible Activity areas referred to in this document (also known as Broad
Program Areas, or BPAs). All funds were required to be committed within 18 months of award
and fully expended within 36 months. The six BPAs shown in Table 2-1 account for 80% of all
EECBG funding; the evaluation will focus on these six areas exclusively.
Table 2-1: BPAs Receiving Top 80% of Funding [4]
Top 6 Broad Program Areas
Energy Efficiency Retrofits
Financial Incentives
Buildings and Facilities
Onsite Renewables
Lighting
Energy Efficiency and Conservation Strategy
2.2 Evaluation Objectives
The EECBG evaluation presents a complex challenge. Evaluators must understand the overall
objectives of the EECBG Program, the variations on the objectives present within each grant (and, in the case of State grants, their sub-grants), and the variety of unique projects (referred to
as “Activities”) carried out under a grant. Much of the funding is directed to projects resulting in
direct energy impacts. Other components are structured to achieve market development and
transformation goals, and still others provide a platform to increase overall awareness and aid in
state and local long-term planning efforts.
The evaluation of the EECBG Program is intended to “document the Program’s principal
achievements and provide valuable information for policy makers and program managers to
help inform future energy efficiency and renewable energy efforts." [5] This will require a
combination of qualitative and quantitative approaches designed to effectively communicate
both the direct energy impacts and the features that enabled success for grantees.
Employing data collected from existing EECBG databases and in-depth interviews with DOE
Program Officers, grantees, and other primary stakeholders, KEMA will answer the three key
research questions of this evaluation:
[4] Based on review of data as of December 18, 2011; this order may change once the full dataset of grants and sub-grants is reviewed.
[5] As stated in the April 2011 EECBG Evaluation Plan original solicitation documents.
1) What is the total lifetime magnitude of energy and cost savings and other key outcomes
achieved in those BPAs that cumulatively account for approximately 80% of total Formula
Grant expenditures in the 2009-2011 program years?
2) What is the lifetime magnitude of outcomes achieved by each of the most heavily-funded
BPAs within the EECBG portfolio?
3) What are the key performance factors influencing the magnitude of EECBG outcomes?
These questions will be answered based on evaluating a sample of 350 grants/activities from a
pool of 2,338 direct grants and over 5,000 sub-grants. The following six BPAs account for
approximately 80% of grant expenditures:
Energy Efficiency Retrofits
Financial Incentives
Buildings and Facilities
Onsite Renewables
Lighting
Energy Efficiency and Conservation Strategy
The evaluation will assess the following metrics:
Energy savings
Reduction in energy costs
Net job creation and productivity impacts
Impact on air quality and fossil fuel emissions
Use of federal, state and local government resources, private sector investment and
non-profit organizations’ services to increase program benefits
2.3 Evaluation Approach
The EECBG Program will be evaluated between August 2011 and December 2012 (16 months).
The implementation of the grants/sub-grants included in the study began in 2009 and may
continue through mid-2012, with approximately 20% of the grants having been completed at the
time the RFP was issued. Grants at all stages of completion are eligible for sampling.
The study will include two key analyses: 1) calculate outcomes attributable to EECBG funding, and 2) identify key factors that influence the magnitude of the outcomes achieved. The latter
investigation is more exploratory in nature and will yield important information for understanding
“whether and to what extent the organizational and operational [and other] factors examined
influenced the achievement of key outcomes.” (Evaluation Plan, RFP p.11)
The key components in the evaluation include:
Characterize the full set of EECBG program activities in terms of BPAs and
measures of size. In terms of the evaluation, the principal objectives of this step are to:
Develop the sample frame from which the individual program activities to be
evaluated will be selected, analyzed, and the results for the individual program
activities will be expanded to the full program
Provide input data to support sample design, including the definition of metric
activities and the allocation of sample resources to the final set of sample activities
Develop the information needed to expand the results from the sampled program
activities to estimate total impacts for the BPA groups
Gather information on the level and quality of available program documentation,
which will be used to make final determinations of evaluation approaches to be taken
in regard to specific BPAs.
Develop the sample of individual program activities for evaluation. The KEMA
team will select a sample of at least 350 individual program activities from the total pool
of grants and sub-grants listed in the DOE database. See Section 4 for a description of
the objectives, methods, and preliminary design of the sample selection process. Once
an Activity is selected into the sample, the KEMA team will deploy the evaluation in the
following steps.
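Section 4 (see Figure 4-2, Illustration of Systematic Proportional Sampling) specifies the sample design. Purely to make the selection logic concrete, the sketch below implements a systematic probability-proportional-to-size draw using funding as the measure of size; the function name and data layout are hypothetical, and the actual design adds stratification, certainty allocations, and selection in waves.

    import random

    def systematic_pps_sample(activities, n):
        """Systematic probability-proportional-to-size selection.
        `activities` is a list of (activity_id, funding) tuples; n is the
        target sample size. Frame ordering provides implicit stratification."""
        total_funding = sum(funding for _, funding in activities)
        step = total_funding / n                  # sampling interval in dollars
        start = random.uniform(0, step)           # random start in first interval
        points = (start + k * step for k in range(n))
        target = next(points, None)
        sample, cumulative = [], 0.0
        for activity_id, funding in activities:
            cumulative += funding
            # select the activity once for each selection point it spans
            while target is not None and target <= cumulative:
                sample.append(activity_id)
                target = next(points, None)
        return sample

An activity whose funding exceeds the sampling interval can be hit more than once; in practice such activities would be moved to a certainty stratum (see Figure 4-1, Certainty Allocations) before the systematic draw.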
Assess the “evaluability” of the sampled individual program activities. The
Evaluation Team will develop a set of criteria for determining whether a program Activity
that is selected into the sample has adequate information concerning energy savings
actions taken to render a reasonable estimate of outcomes. The steps include the
following:
a. Confirm progress in implementation. In order to be included in the impact
analysis, the selected program Activity must be either completed or far enough along
in the implementation process to provide an accurate characterization of the Activity.
For example, a municipal street lighting Activity may be only 10% completed, but all
of the lighting equipment has been purchased and is awaiting installation; thus, it would
be included in the analysis. In contrast, the installation of a custom energy
management system Activity, where the final operating specifications and
characteristics are still in flux, would be excluded from the analysis.
b. Confirm quality and availability of program records. KEMA will review the Activity
data from the Performance and Accountability for Grants in Energy (PAGE) data and
any supplemental data provided by the DOE's Golden, Colorado office for completeness and quality. [6] If significant data elements are missing or appear
to be erroneous and cannot be reconstructed within schedule and budget
constraints, then the program Activity will be removed from the sample and a
substitute selected.
c. Eliminate non-energy-producing Activities through the survey. The third step of
assessing evaluability occurs during the survey of grantee or sub-grantee project
managers in cases where it is discovered that no actions were taken that result in
energy savings.
Conduct engineering desk reviews of sampled grants to estimate energy impacts
of the selected Activities. Each Activity selected in the sample will be assigned to a
project engineer for conducting an engineering desk review of all available data
associated with the grant and Activity.
For each selected individual Activity, we will quantify the energy savings. The savings
estimates will be based upon data and information from the following sources:
Activity data and documentation including grant applications and quarterly
reports from the PAGE database
Other databases including data maintained by DOE’s Golden, Colorado office
Telephone surveys with DOE Program Officers
Telephone surveys with grantee or sub-grantee project managers
[6] DOE's Golden office is in the process of integrating its data into the PAGE database. If the integration process is not completed by the time the savings analysis is conducted, the KEMA team may need to obtain program Activity data directly from the Golden office.
Follow-up telephone interviews with the grantee or sub-grantee project managers who are directly involved in and most knowledgeable about the Activity, to obtain information and data required for the analysis that are not available from the various DOE databases or surveys [7]
Information from state websites regarding the EECBG programs and, as
available and of good quality, the results of state-level evaluations of EECBG
projects.
These data will be combined with documented input assumptions and applied to
standard engineering formulae to estimate savings for all or a sample of participants. [8]
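To illustrate the kind of standard engineering formula involved — this generic lighting-retrofit calculation is only a sketch, not the project's Savings Calculation Tool described in Section 7.3.2, and the function and parameter names are hypothetical:

    def annual_lighting_savings_kwh(n_fixtures, watts_before, watts_after,
                                    hours_per_year, interactive_factor=1.0):
        """Illustrative engineering estimate of annual kWh savings from a
        lighting retrofit; inputs would come from verified Activity data.
        `interactive_factor` can adjust for HVAC interaction effects."""
        delta_kw = n_fixtures * (watts_before - watts_after) / 1000.0
        return delta_kw * hours_per_year * interactive_factor

    # Example: 200 fixtures relamped from 96 W to 59 W, operating 3,500 hours/year:
    # annual_lighting_savings_kwh(200, 96, 59, 3500) -> 25,900 kWh/year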
Attribute estimated energy impacts to the individual program activities. For each
selected Activity, KEMA will carry out an analysis to assess the portion of estimated
energy impacts that were attributable to the EECBG program activities in the sample and
other influences, such as general developments in the market or the activities of other
organizations offering similar kinds of programs or services. Attribution of effects will be
assessed separately for each individual programmatic Activity studied and will be based
on information collected from grantee decision makers and other sources.
See Section 8 for more discussion of our approach to determining attribution.
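In common evaluation practice, such an analysis reduces to an Activity-level attribution (net-to-gross) factor applied to the verified gross savings. A minimal formulation, assuming a factor derived from the survey responses described above, is

\[
\Delta E_{\mathrm{net},i} = A_i \times \Delta E_{\mathrm{gross},i}, \qquad 0 \le A_i \le 1,
\]

where \( \Delta E_{\mathrm{gross},i} \) is the verified gross energy impact of Activity \( i \) and \( A_i \) is its attribution factor.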
Estimate effects of individual activities on carbon emissions. We will use estimates
of annual and lifetime energy savings attributable to the program as inputs to a model
that estimates carbon emissions reductions based on the carbon content of fossil fuels
and electricity consumption avoided. See Section 9 for a description of this analysis.
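A minimal sketch of such a model appears below; the emission factors are placeholder assumptions, whereas the evaluation would apply factors reflecting the carbon content of the avoided fuels and the regional electricity mix.

    # Illustrative only: these emission factors are placeholder assumptions,
    # not the factors the evaluation will actually apply.
    EMISSION_FACTORS = {
        "electricity_kg_co2_per_kwh": 0.6,    # placeholder grid-average factor
        "natural_gas_kg_co2_per_therm": 5.3,  # approximate combustion factor
    }

    def annual_co2_reduction_kg(kwh_saved, therms_saved, factors=EMISSION_FACTORS):
        """CO2 reduction implied by attributed annual energy savings."""
        return (kwh_saved * factors["electricity_kg_co2_per_kwh"]
                + therms_saved * factors["natural_gas_kg_co2_per_therm"])

Lifetime reductions scale the annual figure by the Activity's effective useful life.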
Estimate effects of individual activities on employment. The energy savings
estimates will be combined with other program information, such as matching funds
contributed, participant expenditures for labor and materials, and direct program
[7] The information requested from grantee and sub-grantee project managers will be specific to the individual Activity, will differ from Activity to Activity, and will not require an OMB-approved survey instrument.
[8] These approaches are commonly referred to as engineering-based assessment or statistically-adjusted engineering assessment.
expenditure as inputs into a regional economic model to estimate employment impacts.
See Section 10 for a description of these analyses.
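As a rough sketch of the translation step, the direct effects for a single Activity might first be assembled as a record like the following before being mapped to the model's variables; the field names and dollar amounts are hypothetical, not REMI inputs.

    # Hypothetical record of direct economic effects for one Activity; the
    # mapping of fields like these to model variables is discussed in Section 10.
    activity_direct_effects = {
        "eecbg_expenditure_usd": 250_000,         # direct program spending
        "matching_funds_usd": 50_000,             # matching funds contributed
        "participant_labor_usd": 80_000,          # participant labor expenditures
        "participant_materials_usd": 120_000,     # participant material expenditures
        "annual_energy_cost_savings_usd": 30_000, # recurring savings re-spent locally
    }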
Once the individual Activity evaluations have been completed and reviewed for accuracy and
completeness, the effort will shift to aggregation of sample results by BPA, projection to the
national level, and interpretation of findings. KEMA and its subcontractors will expand the
sample results to the top-funded BPAs, using the relationship between verified metrics for the
sample activities and information on measures of size (funding).
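This expansion is a standard ratio estimate. Using funding as the measure of size, the BPA-level total of a verified metric would take a form such as

\[
\hat{Y}_{\mathrm{BPA}} = X_{\mathrm{BPA}} \cdot \frac{\sum_{i \in s} y_i}{\sum_{i \in s} x_i},
\]

where \( s \) is the set of sampled activities in the BPA, \( y_i \) is the verified metric (e.g., annual kWh savings) and \( x_i \) the funding for sampled Activity \( i \), and \( X_{\mathrm{BPA}} \) is total funding across all activities in the BPA. The final estimator will follow the sample design in Section 4; this form is shown only to make the relationship concrete.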
2.4 Work Plan Structure
The remainder of the Work Plan discusses the approach to the EECBG evaluation in detail.
Section 3 reviews the Data Collection approach including a list of existing data sources
plus plans for augmenting what is available through PAGE and other available
databases with surveys.
Section 4 presents preliminary results of our review of PAGE data, and the subsequent
Sampling Plan for selection of the 350 grant Activities to be studied.
Section 5 shares the objectives and steps to be taken in the Review of EECBG
Databases.
Section 6 discusses implementation of the three surveys that constitute the primary new
information to be used in the analysis.
Section 7 presents the plans for estimating the primary impacts of the projects and how
these impacts will be aggregated to arrive at BPA level results.
Section 8 discusses the approach to assessing attribution of the impacts to EECBG.
Section 9 presents the approach to determining carbon reductions that will result from
the energy savings impacts.
Section 10 addresses the analysis of employment effects due to EECBG grant Activity.
Section 11 is a discussion of the planned analysis of organizational/operational factors
influencing project outcomes.
Section 12 lays out the reporting structure and schedule for the project.
Section 13 is the management and organizational plan for the EECBG evaluation.
3. Data Collection
The evaluation of the EECBG program will be based upon information obtained from three key
data sources:
PAGE and other DOE and OMB databases and Activity documentation and records
reported by grantees and sub-grantees
Telephone surveys with DOE Program Officers and with grantee/sub-grantee project
managers including:
Survey #1: Program Officers Survey - In-depth interviews with DOE Program Officers,
State and Regional Coordinators
Survey #2: Grant Activity Manager Survey (GAMS) - Telephone surveys with Grant
Activity Project Managers who are closest to the activities conducted under each Grant
Activity sampled, with two versions of the instrument as follows:
a) Survey #2A – Residential Grant Activity Manager Survey (ResGAMS)
b) Survey #2B – Non-Residential Grant Activity Manager Survey (Non-ResGAMS)
Survey #3: Performance Factors Survey - Telephone surveys with Grant Managers of
the sampled Activities
Follow-up in-depth interviews with grantee/sub-grantee project managers to obtain additional activity-specific information required for the evaluation that was not provided in
the program databases or in the telephone surveys.
Figure 3-1 shows the relationship between these three primary data sources and key
analytical components of the evaluation.
Figure 3-1: EECBG Data Collection Processing Flow
The following section describes each data collection Activity and how the three activities will
form the basis for evaluating the EECBG program.
Additional sources of information that may be used, as available, are data from program evaluations undertaken by states or grantees. As this is not a reporting
requirement for grantees, we do not anticipate that all sampled jurisdictions will have an
evaluation performed. Further, we cannot say at this point what the quality or
comprehensiveness of the evaluations will be. Even so, the project team will identify and review
any evaluations that are available at the time of the analysis to consider whether any of the
information contained therein might be useful to this effort.
3.1 Reporting Data and Activity Documentation
The evaluation will incorporate an in-depth review of the data that grantees and sub-grantees
are required to report to the Office of Management and Budget (OMB) and DOE [9] on a quarterly
basis. The type of information will include the following:
Quarterly reporting to OMB (federalreporting.gov) – required of grantees; may be delegated to sub-grantees:
- Total amount of ARRA funds received from DOE
- Amount of ARRA funds expended or obligated to projects or activities
- Detailed list of all projects or activities
- Information on subcontracts or sub-grants awarded by the Prime Recipient

Quarterly reporting to DOE (PAGE) – required of all EECBG grantees:
- All Prime Recipients are required to report quarterly through PAGE (Performance Accountability for Grants in Energy)
  o Recipients with allocations >$2M are required to report a subset of the quarterly data on a monthly basis into PAGE; reporting may be delegated to sub-recipients
- Two additional reports:
  o Federal Financial Report (SF-425)
  o Performance Report (at the level of Activity): Activity Status and Activity Milestones
- Three categories of metrics:
  o Financial Metrics
  o Progress Metrics – Jobs/Hours Worked
  o Standard Program Metrics – outlay and obligation of funds, amount of Activity completed

[9] DOE EECBG Program Notice effective April 21, 2010, formula grant reporting guidance.

Grant Reporting and Analysis Software System (GRASS) – compliance and monitoring data provided by EECBG DOE Program Officers:
- Inputs based on information grantees submit to PAGE and on findings from monitoring desk reviews/visits
- Primarily compliance and procedural in nature
- Narrative component can provide insight into project/program accomplishments, challenges and keys to success

Grant Close-Out Tagging Process – final data requirements collected via interview with grant recipients when grants are concluded.
DOE’s Golden, Colorado office is responsible for grant close-out activities. As part of this
process, they have developed a “Tagging Process” whereby more detailed information on the actual activities and actions taken is collected, in recognition that this is the last time that information may be captured on what was accomplished. The developers of the process
shared their data collection form with KEMA and ORNL and enabled us to provide comments
and suggestions for evaluation purposes.
It is the intent of DOE to have the results of the Tagging Process incorporated into PAGE and
appended to each grant/data file upon project close-out. [10] A pilot test of the process is being
undertaken in the first quarter of CY2012, with the full process implemented following the pilot.
KEMA will verify whether the Activities selected for the sample have the additional tagging data
available for use in the evaluation.
The data will consist of various elements including:
Post-grant verification of BPAs and Activity level categorization
Confirmation and more detail concerning any buildings treated
[10] DOE is expected to complete this process in early 2012. However, if the tagging process is completed after the engineering desk review process has begun, or if the comprehensiveness of the data is not sufficient, this information will not be included in this study.
Confirmation and more detail concerning end-uses of energy that were addressed
Confirmation and more detail concerning specific measures installed
Collection of additional detail on building characteristics
3.2 Survey Instruments

3.2.1 Objectives
The scope of this project involves a combination of careful reviews of grant status reports and
applications plus three primary survey data collection activities:
Survey #1: Program Officers Survey - In-depth interviews with DOE Program Officers,
State and Regional Coordinators
Survey #2: Grant Activity Manager Survey (GAMS) - Telephone surveys with Grant
Activity Project Managers who are closest to the activities conducted under each Grant
Activity sampled, with two versions of the instrument as follows:
Survey #2A – Residential Grant Activity Manager Survey (ResGAMS)
Survey #2B – Non-Residential Grant Activity Manager Survey (Non-ResGAMS)
Survey #3: Performance Factors Survey - Telephone surveys with Grant Program
Managers of the sampled Activities
Grant Program Managers (Survey #3) are state employees who oversee the distribution and
administration of the EECBG grants. Grant Activity Managers (Survey #2) are state or local
employees who have been directly involved in the implementation of a specific Activity. For
some Activities, the Grant Program Manager may also serve as the Grant Activity Manager.
The objectives of the interviews and surveys, shown in Table 3-1, are to assemble the critical data necessary for answering the three key research questions in this study.
Table 3-1: List of Surveys and Their Objectives

Survey #1: DOE Program Officers, State and Regional Coordinators (Program Officer Survey). Objectives:
a) Identify the best person to respond to Survey #2 regarding building and measure-level data
b) Obtain their perspective on the sampled grant activities under their jurisdiction
c) Collect data regarding possible key Performance Factors

Survey #2: Grant Activity Project Managers (GAMS) – two versions: Survey #2A (Residential Grant Activity Project Managers) and Survey #2B (Non-Residential Grant Activity Project Managers). Objectives:
a) Confirm proper categorization of the sampled Activity
b) Verify data from PAGE and other sources as to the project description and what energy-saving actions were taken
c) Gather additional detail regarding buildings treated, equipment and measures installed, persistence of measures, changes in operations, and building and measure characteristics to enable calculation of energy savings

Survey #3: Grant Program Managers (Performance Factor Survey). Objectives:
a) Collect data regarding possible key Performance Factors

3.2.1.1 Discussion of Data Collection Approach
The data collection approach followed by KEMA is to conduct in-depth telephone interviews with
DOE Program Officers, State and Regional Coordinators, and telephone surveys with Grant
Program Managers and Grant Activity Project Managers.
The in-depth interviews (Survey #1 – Program Officer Survey) will be guided by an interview
protocol with a series of broad questions regarding the relative performance of EECBG grants
within their portfolio, and a series of questions specific to the sampled grant Activities within
their jurisdiction. Critical to these interviews is the identification or confirmation of one individual
who has the most knowledge about the specific Activity(ies) selected for evaluation. DOE
Program Officers/State and Regional Coordinators who manage grant portfolios where no Grant
Activities are sampled will only be asked the broader series of questions regarding grant
performance in general.
Telephone surveys of Grant Activity Managers (Survey #2A and #2B) will then be employed to
confirm information collected from the PAGE database and other program data sources (as
described in Section 3.1) and obtain more detailed information necessary for the calculation of energy savings. It is recognized that in many cases the PAGE database may not contain sufficiently detailed measure data; therefore, these surveys are designed to obtain an understanding of the Activity from the Program Officer's perspective and to obtain contact information for the Grant Activity Project Manager.
KEMA will conduct the three surveys described above in sequence, starting with DOE Program
Officers, and the State and Regional Coordinators (Survey #1). Once the review of the EECBG
databases and Program Officer Surveys have been completed, a customized survey instrument
will be created from the broad survey instruments described above (Survey #2A and #2B) for
each of the 350 Activities, populating the first section of the survey with basic information such
as Grant Broad Program Area designation, a Grant Activity description (the basis of sample
selection), grant amount, and contact information for the individual being interviewed. The
Grant Activity Manager surveys will then be scheduled by the KEMA team and are anticipated to
take approximately one hour to complete, with the expectation that simpler activities (e.g., those that treated one building or several facilities with one type of measure) will take less time than more complex activities.
Finally, telephone surveys of Grant Managers (Survey #3) will then be employed to collect data
to help determine what factors influence the performance of a grant or Activity.
3.2.1.2 Benefits and Resource Efficiency
There are several efficiencies built into the data collection process. First, the survey instruments are designed in modules or sections so that respondents may skip entire groups of questions that do not apply to their situations. The survey instruments themselves are quite
long because they must take into account all potential situations and scenarios. In executing
the surveys, however, respondents will only be asked questions that apply to their sampled
Activity. The vast majority of interviews will involve only a small subset of the overall survey
sections. For example, it would be a rare situation that any one interviewee would be subject to
the entire set of questions in Survey #2. (That would mean that a facility was treated with
measures in all categories including an on-site renewable energy system.)
A second efficiency built into the Work Plan is that there will be extensive reviews of the EECBG
databases, which will populate, when possible, much of the survey data in Survey #2 up front.
Accordingly, a large part of the telephone survey will consist of verifying information that KEMA
already obtained from the PAGE data set and documentation for the Activity.
A third efficiency results from the planned implementation of the surveys, in that the person
conducting the review of EECBG databases will also conduct the DOE Program Officer calls,
and to the extent possible populate the survey instrument with data, review the results of the
GAMS telephone survey instrument (Survey #2), and conduct any follow-up calls necessary to clarify the information collected. By having one person follow the investigation for a Grant Activity from start to finish, KEMA seeks not only a more cohesive and efficient process but also a better-quality result and product. It is understood that the primary data collection source is the technical staff of the grantees carrying out the projects in their facilities, and that other data will be used in place of grantee technical-staff interviews only when those data are complete and sufficiently reliable.
3.2.2 Survey #1 – DOE Program Officers, Regional and State Coordinators
DOE Program Officers located in the 50 states and 5 territories are responsible for overseeing a
portfolio of EECBG grants within their geographical jurisdictions. A second tier of oversight for
EECBG grants is provided by DOE Regional and State Coordinators. Through DOE’s Technical
Assistance Network, there are State and Regional Coordinators who engage with all grantees
(SEP, EECBG) on a regular basis. While they are responsible for coordinating technical
assistance needs through a network of subject-matter expert teams, they engage with all
grantees in their area on many levels. Some coordinators have a deep understanding of
grantee programs, program/project players, obstacles, and successes. They provide regional
peer-to-peer opportunities for grantees to learn from one another and in general “keep their
finger on the pulse” of grantee activities.
Seventeen regional coordinators located around the country, as shown in Figure 3-2, provide
assistance to EECBG grantees regarding a range of subjects. KEMA will include these
individuals in the Program Officer Survey sample.
Figure 3-2: EECBG/SEP TTA Regional Coordinators
Figure 3-3: Structure, Purpose and Sequence of Survey Modules for Survey #1 Program Officer Survey

Survey #1: Phone interviews to discuss sampled grants and activities, and to confirm or identify the best person to interview for Survey #2 and Survey #3. Respondents: DOE Program Officers (n=55) and DOE Technical Assistance State & Regional Coordinators (n=17).

Survey detail:
- Section A – Introduction: Explain purpose of study & survey
- Section B – Scope of Grants: Confirm information regarding the number and types of grants/activities under the respondent's responsibility; identify grant characteristics
- Section C – Sampled Activities: Verify activity description & funding details; identify other funding sources and other evaluations performed or in process
- Section D – Contact Info for Activities: Identify the grantee contact for the surveys and contact information for the Grant Activity Managers of sampled activities
- Section E – Performance Factors: Collect information on perceived performance factors of successful activities/grants

Note: If a Program Officer or Coordinator does not oversee a sampled Activity, Sections C and D will not be asked.
There is only one core instrument under Survey #1. Those respondents overseeing sampled
Activities will be administered the full survey, Sections A – E. Others who do not have sampled
Activities will only be asked Sections A, B and E.
3.2.3 Survey #2 – Local Grant Activity Manager Survey (GAMS)
This survey is the heart of the evaluation in that it is used to verify self-reported data on the
specific activities sampled for energy savings calculations. It is also the critical source of data
beyond that found in PAGE or the other identified data sources, since it collects information directly from the grantee and sub-grantee staff who are directly involved in the Activity.
Figure 3-4 outlines the sequence and content of Survey #2.
Survey 2 consists of two versions:
Survey 2A: Residential GAMS – Survey questions for Grant Activities targeted to
residential end users of energy, residential buildings, and residential appliances and
measures.
Survey 2B: Non-Residential GAMS – Survey questions for non-residential Grant
Activities including those focused toward municipal buildings, commercial or business
establishments, industrial end user facilities, and the equipment and systems they
contain.
Both survey instruments follow the same general format with the content of the questions
tailored to relate to residential or non-residential buildings. The primary objective of this survey
is to collect detailed technical information required to calculate savings estimates. The surveys
address the following topics:
Introduction and screening for correct respondent
Confirmation of Broad Program Area categorization
Respondent’s role in the Activity
Building and firmographic characteristics
Verification of inventory by end use and measure
Attribution
Figure 3-4: Structure, Purpose and Sequence of Survey Modules for Survey #2 Grant Activity Project Manager Survey

Survey #2: Phone survey of the individual most closely associated with the specific sampled grant Activity (Grant Activity Managers, n=350). Separate versions are used for the residential sector (Survey 2A) and the non-residential sector (Survey 2B), each covering a portion of the 350 sampled activities.

Survey detail:
- Section A – Background: Basic grant information, filled in prior to the survey
- Section B – Introduction: Explain purpose of study & survey
- Section C – Screener: Confirm appropriate respondent
- Section D – Grant Funding Details: Verify grant activity & funding detail; identify other funding sources
- Section E – Targeted Customer Segments: Identify the Activity's target market
- Section F – Confirm Broad Program Area: Route the survey into one of six BPA tracks (1. EE Retrofits, 2. Financial Incentives, 3. Buildings & Facilities, 4. Lighting, 5. On-Site Renewables, 6. EE & Conservation Strategies). If no buildings or facilities are treated, the survey is terminated at Section F.
- Section G – Housing Unit/Facility Information: Assess the number and characteristics of buildings treated
- Section H – Measures Installed: Installation confirmation and detail, including operation and persistence, by measure category (1. Lighting, 2. Cooling, 3. Heating, 4. Refrigeration/Appliances, 5. Other Equipment, 6. On-Site Renewables); these modules apply only to activities involving those measures and may be asked for one building or a group of buildings
- Section I – Attribution: Determine impacts attributable to the EECBG grant
- Section J – Jobs Created/Retained: Confirm the number of jobs resulting from the EECBG grant
The Residential and Non-Residential GAMS will be administered to those individuals identified
in Survey #1 as being the most knowledgeable about each respective sampled Activity. In
some cases, it may be the Grant Program Manager who is also closest to the Activity within the
grant; whereas in others, particularly those with multiple activities carried out under one Grant, it
is more likely that another individual was responsible for carrying out the actual project(s) under
that Activity. We will begin by verifying the name of the contact person listed in PAGE for each
Activity with the DOE Program Officer in Survey #1; if a more appropriate person is identified, we
will collect that person's name, position, and contact information.
The survey will start by verifying that we have the correct person on the phone, and that
adequate time is set aside for the interview. It is important to identify the person most
knowledgeable about the Activity. If the initial contact person is not the appropriate contact, the
interviewer will ask for the contact information for the appropriate person. All calls will be
scheduled ahead of time to allow the respondent to prepare for the discussion and set aside the
time necessary to complete the survey (estimated at up to one and a half hours for particularly
complex activities, to a minimum of 40 minutes for those activities involving limited buildings and
measures).
3.2.3.1 Verification of Measures and Actions
As shown in Figure 3-4, the BPA categories of Energy Efficiency Retrofits (1), Lighting (4) and
On-Site Renewables (5) activities are likely to most directly involve actual installations of
measures in buildings, because of the nature of the category. Thus, we anticipate being able to
proceed relatively quickly to energy savings related questions in Sections F, G and H. For the
other BPA categories, a sequence of questions unique to each category must be posed before
one can determine whether buildings or facilities are actually treated (directly or indirectly), what
types and how many, whether any information is available on those buildings/treatments, and
whether the respondent has the knowledge to be able to provide that information. If it is
discovered that no actual energy savings actions were taken for an Activity selected in the
sample, the survey will be concluded at that point. The selected Activity will be replaced with
another Activity. The sampling methodology and replacement protocols are discussed in
Section 4.
Two examples of possible outcomes for these categories are provided:
Example 1: Financial Incentive Program Activity (F2) - An Activity selected into the sample
under the Financial Incentive Program BPA may be determined to consist of a loan program for
small businesses to replace lighting systems. The survey will therefore probe what types of
non-residential buildings were targeted (small business), how many actual small business
facilities were treated using loans given out under the Activity, how many lighting measures
were installed in those facilities and what types. More information will be sought regarding
what kinds of equipment were replaced, hours of use data for the facilities, and other
information necessary for developing an estimate of energy savings. It should be noted that
some of this information will be collected as part of Survey #2, and some may require a follow-up interview, as described in Section 3.3 below.
Example 2: Energy Efficiency and Conservation Strategy Program Activity (F6) – An
Activity selected under this BPA has a greater chance of not resulting in specific treatments
made to a building or facility, due to the nature of the activities described under this category.
Most often activities under this BPA consist of indirect energy savings projects, such as
development of a Community Sustainability Plan or other policy, communications and
educational projects. Some jurisdictions may have information about specific buildings treated
or actions taken as a direct or indirect result of such activities. The Survey in Section F6 will
seek to determine whether any buildings were actually treated, how many, what types, and with
what treatments. If no buildings are known to have been treated (or other energy savings
actions taken), then the survey will be terminated after Section F, the respondent thanked and
the call ended.
3.2.3.2 Attribution Questions
Following the customized questions from sections F, G and H above, all respondents will be
guided to Section I for a series of questions related to attribution. These questions are based
upon industry-standard methods of probing for the extent to which the specific intervention – in
this case, the funding from the EECBG grant – influenced the actions taken. In the case of
many EECBG activities, it is likely that other funding sources were tapped to complete the
project; whereas in others, the entire project may have been paid for exclusively with EECBG
grant dollars. Beyond the question of funding, attribution questions also deal with the decision-making process. Was the project planned before EECBG funding was sought? Would
it have gone forward without EECBG funding?
The results of these questions will feed the analysis of attribution by applying a factor to the
energy savings achieved. If the EECBG grant is the primary source of influence,
then the energy savings and demand reduction impacts will not be adjusted downward. If,
however, the EECBG grant was only one factor in the decision to proceed with the project, or if
the project had multiple funding sources, then the energy savings estimates will have to be
adjusted to account for the various influences on project outcomes.
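To make the adjustment concrete, the sketch below applies an attribution factor to gross savings. It is a minimal illustration only: the function, its inputs, and the weighting values are hypothetical placeholders, not the scoring rules of this evaluation.

    # Illustrative sketch only: scale gross savings by an attribution factor.
    # The derivation below is a hypothetical placeholder, not the Work Plan's
    # actual attribution scoring rules (see Section 8).

    def attribution_factor(eecbg_funding_share, planned_before_grant,
                           would_have_proceeded):
        """Return a multiplier in [0, 1] reflecting EECBG influence."""
        factor = eecbg_funding_share      # share of project dollars from EECBG
        if planned_before_grant:
            factor *= 0.5                 # project predates the grant request
        if would_have_proceeded:
            factor *= 0.5                 # project likely viable without EECBG
        return max(0.0, min(1.0, factor))

    gross_kwh = 120_000                   # gross annual savings (invented)
    net_kwh = gross_kwh * attribution_factor(0.6, False, True)
    print(net_kwh)                        # 36000.0 kWh attributable to EECBG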
Section 8 of this Work Plan explains the approach to assigning attribution in more detail.
3.2.4 Survey #3 – Performance Factors Survey
The third survey involves collection of data related to the performance of grants and activities to
help determine what factors influence the performance of a grant or Activity. Section 11 of this
Work Plan describes KEMA’s strategy for this analysis in detail. For this survey, KEMA
envisions a brief introductory section followed by verification/collection of data on the
characteristics of the Activity sampled in Section C. This will be followed by questions related to
the various factors that may influence the performance of the grant/sub-grant activities, such as
number of staff devoted to the project, the number of times that the grantee took advantage of
available Technical Assistance, etc. Other questions in this section may relate to the
understanding and perceptions of the Grant Program Manager regarding qualitative factors
(enthusiasm of the Activity Manager, history of the jurisdiction in conducting previous projects,
or the general economic health of the jurisdiction receiving the grant, etc.).
Figure 3-5 illustrates the sequence and content of Survey #3.
Figure 3-5: Structure, Purpose and Sequence of Survey Modules for Survey #3 – Performance Indicators Survey
In order to capture the same information from various perspectives, some of the questions
posed to Grant Program Managers in Survey #3 will also, as appropriate, be asked of DOE
Program Officers, Regional and State Coordinators, and Grant Activity Managers as part of
Surveys 1 and 2.
3.3 Follow-up Interviews with Grantee and Sub-grantee Activity Project Managers
After the GAMS surveys are completed, the data for each Activity will be reviewed to determine
whether there are any remaining gaps in the data needed for calculating energy savings, or
clarifications required to the responses provided. It is important to recognize that the primary
purpose of the GAMS surveys is to elicit the information necessary to conduct the savings
analysis. However, it is expected that some Activity information will be unique to the specific
Activity and will not be feasible to collect using the GAMS surveys. In those instances, the desk
reviewers will construct a customized set of questions to address the gaps and make a follow-up call to the Grant Activity Project Manager. Given the diversity in Activities and the
preliminary review of the DOE databases, it is expected that follow-up interviews will be required
for the majority of the 350 sampled Activities.
In addition, it may be determined during the calls for Survey #2 that another individual is needed
to address some section of the Survey #2 questionnaire. If that is the case, then the survey can
be terminated, or a section skipped, for follow up with the other individual in order to collect the
best information possible.
4. Program Data Characteristics and Sample Design
This chapter summarizes the initial analysis of the PAGE database and describes the proposed
sample design and framework, with some details based on that initial analysis.
To describe the sampling approach, we discuss the following:
1. Definition of the sampling frame: what are the units or elements that can be selected for
the sample? How are these units defined and identified? Which units are included in
the study and which are excluded?
2. How will the sample be stratified?
3. How will the sample points be allocated to the stratification cells?
4. Staging: how will the sample be staged over time?
4.1 Sampling Frame
4.1.1 Structure and Sampling Units
4.1.1.1 Overview
This study will be conducted by selecting a random sample of EECBG-funded Activities for
evaluation, and estimating the full program savings based on the evaluation results for this
sample. To develop the sample, it is first necessary to define what the units or elements are
that can be selected for evaluation. The list of all the units or elements that can be selected,
and that in turn are formally represented by the study, is the sampling frame. For this study, the
units or elements in the sampling frame are “Activities.”.”
The RFP specified that a sample of “Activities” will be selected for evaluation. An “Activity” is an
initiative conducted under a single grant or sub-grant to a particular agency. The Activity may
be an individual energy efficiency project at a particular facility, a group of energy efficiency
projects at one or more facilities, or broader initiatives or programs not identified with specific
facilities or projects.
A record in the PAGE database is identified by its “Activity Worksheet Unique ID.” A single
grant may have multiple Activity Worksheet Unique IDs. As described above, at least 60% of
grants to states and territories are further distributed in sub-grants. Sub-grant detail is
not provided in PAGE. However, a single sub-grant also may have multiple initiatives within it
that can be considered separate “Activities,” but that are not identifiable within PAGE.
The sampling frame to be constructed therefore will include two types of Activities:
1. Grant-level Activity IDs: An Activity Worksheet Unique ID in PAGE that does not involve
sub-grants
2. Sub-grant-level Activity IDs: A sub-grant or component of a sub-grant.
To construct the sample frame, it will therefore be necessary to compile information on activities
within sub-grants similar to that available in PAGE for non-sub-grantee Activity Worksheet
Unique IDs. The Golden program office has indicated they will provide this sub-grant-level
information during the first quarter of CY2012.
4.1.1.2 Further Discussion
EECBG grants are given to three types of entities: local government agencies, tribes, and
state/territorial agencies. Most of the state/territorial grants are distributed to other agencies via
sub-grants. Thus, local government agencies can receive EECBG funding either via a direct
grant from DOE, or via a sub-grant from the state/territorial agency that received the DOE grant.
The state/territorial grants represent a small fraction of the total number of DOE grants
(55, versus 1,696 direct local government grants) but account for over a quarter of the total funding,
most of it disbursed to other agencies.
As described in the RFP, “’Activities’ are the basic building blocks of the [EECBG] Program and
refer to specific actions taken by individual grant recipients.” Thus, the Activity will be the
common ultimate unit of selection for the sample. As described in the RFP and in KEMA’s
proposal, a total of 350 Activities will be selected.
However, for the state/territorial grants that are distributed as sub-grants, the specific actions
ultimately taken are not listed within the grant information in the PAGE database, but only in
more detailed information expected to be available for each sub-grant.
For each DOE grant, the PAGE database lists the individual efforts funded by the grant, the
Activity type, description, contact information, and associated funding amount. For each effort
within a grant, there is a record in the database identified by an Activity Worksheet Unique ID.
For the direct grants to local governments and tribes, the Project Descriptions indicate fairly
specific actions, such as efficiency improvements to specific facilities or infrastructure, a
particular educational program, or a particular planning process. For the state/territorial grants,
the Project Descriptions describe sub-grant processes or assistance programs to be
implemented with the funds. Specific actions, improvements, or programs are not reported in
PAGE. However, we would like to conduct evaluations, and to select units to be evaluated, at
this granular level of specific actions, improvements, or programs. We do not want to select an
entire state grant for evaluation. Rather, we want to develop the more fine-grained detail on
Activities within the sub-grants, and draw a sample of these Activities.
The DOE Office in Golden, CO, which administers the state/territorial grants and other large
grants ($2 million or more), committed to compiling a database of specific activities within each
sub-grant. This database will provide details for each specific Activity similar to that available
for direct grants in PAGE. We anticipate that this database will be available by the end of the
first quarter of CY2012.
We therefore propose to define the sampling unit for the selection of 350 activities as an
Activity, defined as one of the following:
An Activity corresponding to an Activity Worksheet Unique ID in PAGE, for grants not
redistributed as sub-grants.
A specific, uniquely identifiable Activity defined within sub-grants based on additional
data for state/territorial grants re-distributed as sub-grants.
In the latter case, there may be multiple Activities within a single Activity Worksheet Unique ID in PAGE.
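A minimal sketch of how such a two-part frame might be assembled is shown below. The record field names (worksheet_id, has_subgrants, activity_no, and so on) are hypothetical stand-ins for the actual PAGE and Golden-office extracts, not their real schemas.

    # Sketch: one frame row per Activity, drawn either from a PAGE Activity
    # Worksheet Unique ID (grants without sub-grants) or from the sub-grant
    # detail expected from the Golden office. Field names are illustrative.

    def build_frame(page_records, subgrant_records):
        frame = []
        for rec in page_records:
            if not rec["has_subgrants"]:
                # Grant-level Activity: the PAGE record itself is the unit
                frame.append({"unit_id": rec["worksheet_id"],
                              "level": "grant",
                              "budget": rec["budget"],
                              "bpa": rec["bpa"]})
        for sub in subgrant_records:
            # Sub-grant-level Activity: possibly several per PAGE worksheet ID
            frame.append({"unit_id": f'{sub["worksheet_id"]}-{sub["activity_no"]}',
                          "level": "sub-grant",
                          "budget": sub["budget"],
                          "bpa": sub["bpa"]})
        return frame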
Table 4-1 indicates the distribution of grants and activities based on the recent PAGE data as of
December 18, 2011. The highlighted values for number of sub-grants, number of Activities
within the sub-grants, and corresponding average budget per Activity for sub-grantees are
approximations at this point. These approximations are based on the rough estimates in the
RFP of the numbers of sub-grants.
Table 4-1: Structure of grant and sub-grant recipients and activities for the 6 BPAs

Grant Recipient | Number of Grants | Number of Sub-grants | # of Activities | Budget ($B) | Budget to Grants ($B) | % of Budget | Avg. Budget per Grant ($M) | Avg. Budget per Activity ($1,000) | # Sample Points Allocated Prop'l to Budget
Local Government | 1,696 | n/a | 6,000 | 1.88 | 1.60 | 70% | 1.1 | 313 | 244
Tribes | 587 | n/a | 700 | 0.055 | 0.05 | 2% | 0.1 | 79 | 7
States | 55 | 5,400 | 10,000 | 0.767 | 0.65 | 28% | 13.9 | 77 | 99
Total | 2,338 | 5,400 | 16,700 | 2.702 | 2.3 | 100% | 1.2 | 162 | 350
There are approximately 200 Activity Worksheet Unique IDs in PAGE corresponding to
state/territorial grants, totaling over $750 million. Thus, the average spending per PAGE
Unique ID is about $3.5 million for these grants, compared with about $300,000 for the local
government grants. It will therefore be more effective to sample the state/territorial grants at the
finer level corresponding to specific activities within the sub-grants, rather than treating an entire
grant-level Activity, corresponding to a state program or sub-grant making function, as a unit for
selection. This is the reason we propose to define Activities within sub-grants as the unit of
selection for these types of grants.
An initial indication of a likely sample distribution is given in the final column of the table. This
column indicates the number of sample points that would go to each grantee type if the 350
sample points are allocated proportional to total budget. The final sample allocation will take a
number of other factors into account, as described below.
4.1.2 Frame Restrictions
4.1.2.1 BPAs Included
The RFP specifies that the focus of this evaluation will be on the six largest BPAs, which
together account for over 80 percent of the total program funding. Table 4-2 below indicates the
budget and number of Activity Worksheet Unique IDs in PAGE by BPA as of December 18,
2011. Based on these data, the six largest BPAs are the same as those indicated in the RFP.
Together, these account for about $2.3 billion, or 84 percent of the program budget.
Table 4-2: Budget and Activity Count by BPA per PAGE – 12/18/2011

BPA | Budget | Unique Activities
Energy Efficiency Retrofits | $1,061,789,136 | 2,489
Financial Incentive Program | $498,676,898 | 361
Buildings and Facilities | $265,774,085 | 781
Lighting | $196,714,002 | 631
Onsite Renewable Technology | $169,129,015 | 454
Energy Efficiency and Conservation Strategy | $129,680,034 | 771
Transportation | $117,174,520 | 528
Other | $77,257,567 | 77
Technical Consultant Services | $67,992,028 | 526
Residential and Commercial Buildings and Audits | $64,102,333 | 442
Material Conservation Program | $32,003,942 | 160
Energy Distribution | $30,787,364 | 69
Reduction/Capture of Methane/Greenhouse Gases | $29,968,657 | 42
Codes and Inspections | $18,347,452 | 111
Grand Total | $2,759,397,032 | 7,442
Total of Top 6 | $2,321,763,170 | 5,487
Top 6 as % of Total | 84% | 74%
As noted above, the Activities within sub-grants are not identifiable from PAGE. There is a BPA
defined for each grant-level Activity. Ad hoc review of some of the Project Descriptions for
state/territorial grants indicates that the Activities under the corresponding sub-grants, once that
detail is available, will likely be consistent with this grant-level BPA, at least in many cases.
However, when we review the specific activities within the sub-grants, we may find that some of
the Activities belong to a different BPA than the one indicated for the grant in PAGE. Thus, the
distribution of spending by BPA and possibly even the identity of the six largest BPAs may shift
after the Activities within sub-grants are identified.
Another consideration in defining what is included in the sampling frame is the nature of the
activities under BPA #1, “Energy Efficiency and Conservation Strategy,” particularly for the
state/territory grants. The Energy Efficiency and Conservation Strategy BPA differs from the
other BPAs offered in the EECBG program. As a condition for participation in EECBG, potential
grantees/sub-grantees were required to develop an energy efficiency and conservation strategy
plan for their municipality or tribe. Some strategies developed in this BPA did not necessarily
translate to direct energy savings but rather identified other EECBG BPAs or other energy
efficiency programs (e.g., State Energy Programs, utility-sponsored energy efficiency programs,
etc.) that should be pursued to achieve direct energy savings. Therefore, Activities in which
energy efficiency measures were installed will be included in the analysis for BPA #1.
Activities with no direct energy savings that are related to program administration, such as hiring
an energy efficiency coordinator or costs for administering an energy efficiency program, will be
excluded from the analysis. However, Activities such as training and education programs that
may not produce direct energy savings but rather indirectly lead to activities (e.g., building retrofits) that
produce energy savings will be included in the analysis of BPA #1.
Another restriction for consideration is project completeness. We plan to evaluate projects that
are not necessarily complete at the time of selection or evaluation. However, it will be important
to give credit for savings only to projects that are likely to be completed. The RFP specifies that
the activity “must be far enough along so that its essential characteristics and operating
environment can be well understood.” This specification does not require that the project be
complete or nearly complete, but that it is sufficiently under way so that it is fully defined and
unlikely to be abandoned. Final determination of likelihood of completion will be part of the
evaluation of the selected activities.
4.2 Sampling in Waves
The RFP specifies that the sample will be stratified by expected time of completion, with
evaluations first conducted for the more complete activities, moving to the next group only when
the Activities in that group are sufficiently complete. Given the current timing of the sample
selection, we propose to consider a maximum of two waves. Thus, we will stratify Activities into
Early and Late, based on the time each Activity first reached a stage of being sufficiently
complete to be evaluated.
Activities sufficiently complete as of the first sample pull (Early stratum) can be selected at that
time, or at the second pull. Activities sufficiently complete only as of the second sample pull
(Late stratum) can only be selected at that time. For this reason, we will err on the side of
under-allocating sample to the first pull, because we can add sample points to the Early stratum
in the second pull if needed. If we over-allocate to the first pull initially, we will not have enough
sample points left to give an appropriate proportional allocation to the Late stratum in the
second pull.
Thus, if we sample in two waves, we will first identify all Activities that are sufficiently complete
for evaluation as of a specified cut-off date. This is the Early stratum. For the first wave, we will
select a sample of Activities from the Early stratum. The Late stratum will include all Activities
that were not sufficiently complete as of the Early cut-off date, but are sufficiently complete by
the time of the late cut-off date. For the second wave, we will select a sample from the Late
stratum, as well as selecting additional cases from the Early stratum.
It is anticipated that the sub-grantee detailed data will be available in PAGE at the end of the
first quarter of CY2012. Therefore, the initial sampling wave may include only DOE
direct grantees rather than state grantees. We will design our sampling allocations across the
two waves to ensure that sub-grantees and larger projects (if applicable) are sufficiently
represented in the second sampling wave.
In general, larger projects take longer to plan and complete. For this reason, it is common in
many programs to find that earlier completed projects tend to be smaller than those completed
later. If this pattern holds for this program as well, we may find that the Early stratum involves
mostly smaller projects, and that most of the larger ones are in the Late stratum. For this
reason, it is particularly important not to over-allocate sample to the first wave, which will include
Activities only from the Early stratum.
4.3 Stratification
The RFP suggests stratifying the sample by BPA and within BPA by subcategories appropriate
to each BPA, such as technology type. However, the only subcategory information available in
PAGE is the Metric Activity. Each Activity Worksheet Unique ID in PAGE has a Broad Program
Activity (BPA), as well as one or more Metric Activities. The Metric Activity is an activity
category. The list of possible Metric Activities is the same as the list of possible BPAs. Thus,
the Metric Activity can be viewed as a “secondary” BPA. In some cases the Metric Activity is
the same as the BPA.
We will stratify Activities by state, sub-grantee, BPA, and Unique Activity Identification Code,
as defined by DOE. Table 4-3 below indicates initial allocations to the cells defined by BPA
and Metric Activity, with allocation proportional to proposed EECBG budget.
Table 4-3: Initial Proportional to Size Allocation by BPA and Metric Activity

BPA / Metric Activity | Proposed EECBG Budget Total | % of Frame Budget | % of BPA Budget | Number of Unique Activities | Proportional-to-Budget Sample Allocation | Rounded Allocation
Energy Efficiency Retrofits | $1,061,789,136 | 46% | 100% | 2,489 | 160.1 | 160
    Building Retrofits | $933,713,211 | 40% | 88% | 2,209 | 140.8 | 141
    Loans and Grants | $33,346,225 | 1% | 3% | 18 | 5.0 | 5
    Government, School, Institutional Procurement | $32,812,418 | 1% | 3% | 111 | 4.9 | 5
    Other | $30,442,438 | 1% | 3% | 36 | 4.6 | 5
    Industrial Process Efficiency | $12,300,181 | 1% | 1% | 36 | 1.9 | 2
    Building Energy Audits | $6,427,561 | 0% | 1% | 20 | 1.0 | 1
    Transportation | $5,447,741 | 0% | 1% | 15 | 0.8 | 1
    Clean Energy Policy | $2,046,352 | 0% | 0% | 5 | 0.3 | 0
    Financial Incentives and Rebates | $1,879,535 | 0% | 0% | 6 | 0.3 | 0
    Energy Efficiency Rating and Labeling | $975,395 | 0% | 0% | 10 | 0.1 | 0
    Renewable Energy Market Development | $845,871 | 0% | 0% | 7 | 0.1 | 0
    Building Codes and Standards | $711,046 | 0% | 0% | 5 | 0.1 | 0
    Workshops, Training, and Education | $656,791 | 0% | 0% | 9 | 0.1 | 0
    Technical Assistance | $184,371 | 0% | 0% | 2 | 0.0 | 0
Financial Incentive Program | $498,676,898 | 21% | 100% | 361 | 75.2 | 75
    Loans and Grants | $354,321,895 | 15% | 71% | 216 | 53.4 | 53
    Financial Incentives and Rebates | $78,312,410 | 3% | 16% | 113 | 11.8 | 12
    Other | $48,244,813 | 2% | 10% | 6 | 7.3 | 7
    Building Retrofits | $14,395,309 | 1% | 3% | 15 | 2.2 | 2
    Clean Energy Policy | $1,878,355 | 0% | 0% | 2 | 0.3 | 0
    Technical Assistance | $995,706 | 0% | 0% | 3 | 0.2 | 0
    Building Energy Audits | $473,823 | 0% | 0% | 3 | 0.1 | 0
    Workshops, Training, and Education | $54,587 | 0% | 0% | 3 | 0.0 | 0
Buildings and Facilities | $265,774,085 | 11% | 100% | 781 | 40.1 | 40
    Building Retrofits | $135,209,795 | 6% | 51% | 265 | 20.4 | 20
    Workshops, Training, and Education | $27,863,302 | 1% | 10% | 209 | 4.2 | 4
    Loans and Grants | $25,175,392 | 1% | 9% | 11 | 3.8 | 4
    Other | $19,632,059 | 1% | 7% | 42 | 3.0 | 3
    Building Energy Audits | $16,036,481 | 1% | 6% | 72 | 2.4 | 2
    Government, School, Institutional Procurement | $9,767,926 | 0% | 4% | 52 | 1.5 | 1
    Financial Incentives and Rebates | $6,907,033 | 0% | 3% | 8 | 1.0 | 1
    Clean Energy Policy | $6,709,643 | 0% | 3% | 41 | 1.0 | 1
    Industrial Process Efficiency | $6,662,915 | 0% | 3% | 12 | 1.0 | 1
    Renewable Energy Market Development | $3,867,872 | 0% | 1% | 16 | 0.6 | 1
    Technical Assistance | $3,528,852 | 0% | 1% | 30 | 0.5 | 1
    Energy Efficiency Rating and Labeling | $2,279,856 | 0% | 1% | 9 | 0.3 | 0
    Transportation | $1,094,734 | 0% | 0% | 4 | 0.2 | 0
    Building Codes and Standards | $1,038,224 | 0% | 0% | 10 | 0.2 | 0
Lighting | $196,714,002 | 8% | 100% | 631 | 29.7 | 30
    Transportation | $103,591,905 | 4% | 53% | 258 | 15.6 | 16
    Government, School, Institutional Procurement | $58,304,290 | 3% | 30% | 222 | 8.8 | 9
    Building Retrofits | $20,038,186 | 1% | 10% | 96 | 3.0 | 3
    Other | $9,609,295 | 0% | 5% | 28 | 1.4 | 1
    Energy Efficiency Rating and Labeling | $2,299,250 | 0% | 1% | 9 | 0.3 | 0
    Industrial Process Efficiency | $1,001,461 | 0% | 1% | 2 | 0.2 | 0
    Renewable Energy Market Development | $641,732 | 0% | 0% | 6 | 0.1 | 0
    Loans and Grants | $483,000 | 0% | 0% | 3 | 0.1 | 0
    Clean Energy Policy | $436,484 | 0% | 0% | 3 | 0.1 | 0
    Building Codes and Standards | $163,400 | 0% | 0% | 1 | 0.0 | 0
    Workshops, Training, and Education | $100,000 | 0% | 0% | 1 | 0.0 | 0
    Technical Assistance | $25,000 | 0% | 0% | 1 | 0.0 | 0
    Financial Incentives and Rebates | $20,000 | 0% | 0% | 1 | 0.0 | 0
Onsite Renewable Technology | $169,129,015 | 7% | 100% | 454 | 25.5 | 25
    Renewable Energy Market Development | $151,875,786 | 7% | 90% | 381 | 22.9 | 23
    Other | $6,552,460 | 0% | 4% | 13 | 1.0 | 1
    Clean Energy Policy | $3,810,200 | 0% | 2% | 14 | 0.6 | 1
    Government, School, Institutional Procurement | $3,094,562 | 0% | 2% | 10 | 0.5 | 0
    Building Retrofits | $2,779,030 | 0% | 2% | 24 | 0.4 | 0
    Industrial Process Efficiency | $386,692 | 0% | 0% | 2 | 0.1 | 0
    Building Energy Audits | $238,100 | 0% | 0% | 2 | 0.0 | 0
    Technical Assistance | $133,672 | 0% | 0% | 3 | 0.0 | 0
    Transportation | $116,520 | 0% | 0% | 2 | 0.0 | 0
    Loans and Grants | $92,600 | 0% | 0% | 1 | 0.0 | 0
    Workshops, Training, and Education | $49,393 | 0% | 0% | 2 | 0.0 | 0
Energy Efficiency and Conservation Strategy | $129,680,034 | 6% | 100% | 771 | 19.5 | 20
    Loans and Grants | $41,615,130 | 2% | 32% | 12 | 6.3 | 6
    Clean Energy Policy | $32,825,616 | 1% | 25% | 413 | 4.9 | 5
    Other | $30,728,530 | 1% | 24% | 130 | 4.6 | 5
    Technical Assistance | $10,237,762 | 0% | 8% | 111 | 1.5 | 2
    Building Retrofits | $4,881,777 | 0% | 4% | 26 | 0.7 | 1
    Workshops, Training, and Education | $4,032,576 | 0% | 3% | 34 | 0.6 | 1
    Building Energy Audits | $2,461,577 | 0% | 2% | 26 | 0.4 | 0
    Transportation | $1,402,700 | 0% | 1% | 4 | 0.2 | 0
    Government, School, Institutional Procurement | $758,691 | 0% | 1% | 5 | 0.1 | 0
    Building Codes and Standards | $368,210 | 0% | 0% | 2 | 0.1 | 0
    Energy Efficiency Rating and Labeling | $208,588 | 0% | 0% | 2 | 0.0 | 0
    Renewable Energy Market Development | $158,876 | 0% | 0% | 6 | 0.0 | 0
Grand Total | $2,321,763,170 | 100% | n/a | 5,487 | 350.0 | 350
As the table indicates, some BPA-Metric Activity cells have less than one sample point
allocated; each such cell contains less than 0.15 percent of the proposed EECBG budget.
Based on current PAGE information, these small cells combined represent less than 1 percent
of the total frame. We recommend either collapsing these small cells (combining them with
other cells) or excluding them from the sampling frame. Collapsing cells with small allocations
(say 1 to 3 sample points) might also make sense.
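One possible collapsing rule is sketched below, assuming each cell is keyed by (BPA, Metric Activity) with its fractional allocation already computed; the threshold and the combined-cell label are illustrative choices, not decided rules.

    # Sketch: fold cells whose expected allocation falls below a threshold
    # into a combined cell within the same BPA.

    def collapse_small_cells(allocations, threshold=1.0):
        """allocations: {(bpa, metric_activity): expected sample points}"""
        collapsed = {}
        for (bpa, metric), points in allocations.items():
            key = (bpa, metric) if points >= threshold else (bpa, "Collapsed")
            collapsed[key] = collapsed.get(key, 0.0) + points
        return collapsed

    cells = {("Lighting", "Transportation"): 15.6,
             ("Lighting", "Loans and Grants"): 0.1,
             ("Lighting", "Clean Energy Policy"): 0.1}
    print(collapse_small_cells(cells))
    # {('Lighting', 'Transportation'): 15.6, ('Lighting', 'Collapsed'): 0.2}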
A possible additional stratification variable is percent complete (as of the time the sample is
defined for each wave). Based on the percent complete, we may stratify into categories
such as:
Complete or substantially complete
Partly complete
Substantially incomplete
Not started
Activities that are not started or are insufficiently complete as of the time of sampling will be
excluded. Other dimensions that will be considered for stratification include size (proposed
budget), state, and grantee type. As described below, our proposed allocation and selection
procedures will ensure that the sample is allocated approximately proportional to size, and is
distributed across states and grantee types.
4.4 Allocation
Our general approach is to allocate samples proportional to size, with proposed EECBG budget
as the basic measure of size (MOS). Ideally, we would sample proportional to expected
savings, but that information is unavailable.
To give greater emphasis to certain types of activities, an adjusted measure of size can be used
that weights activities or cells with certain characteristics higher. For example, if we want to
include partly complete activities at half the sampling rate, we would create a modified measure
of size equal to proposed budget for substantially complete and complete activities, equal to half
the proposed budget for partly complete activities, and equal to zero for substantially incomplete
and incomplete activities.
We would then allocate sample proportional to this modified MOS. That is, we calculate the
total MOS by summing the MOS over all Activities in the sampling frame. For each sampling
cell, we determine what percentage of the total MOS is in that sampling cell. This is the
percentage of sample points allocated to that sampling cell.
The total sample of 350 implies that one Activity will be selected for approximately every 0.3
percent of the total MOS. That is, a cell with 1 percent of the total MOS will get an allocation of
3 sample points. A cell with 10 percent of the total MOS will have an allocation of 35 sample
points. The allocation procedure we will use follows these general principles, with some
refinements, as described below.
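The sketch below illustrates the two steps together, the completeness re-weighting of the measure of size and the proportional allocation, using the example weights from the preceding paragraphs; the status labels and input values are assumed for illustration.

    # Sketch: modified measure of size (MOS) and proportional allocation.
    # Weights follow the example in the text: full weight for complete or
    # substantially complete activities, half for partly complete, zero
    # for substantially incomplete or not-started activities.

    STATUS_WEIGHT = {"complete": 1.0, "substantially complete": 1.0,
                     "partly complete": 0.5, "substantially incomplete": 0.0,
                     "not started": 0.0}

    def allocate(activities, total_sample=350):
        mos = {a["id"]: a["budget"] * STATUS_WEIGHT[a["status"]]
               for a in activities}
        total_mos = sum(mos.values())
        # Each unit or cell receives points in proportion to its MOS share;
        # with n = 350 that is roughly one point per 0.3% of total MOS.
        return {k: total_sample * v / total_mos for k, v in mos.items()}

    acts = [{"id": "A1", "budget": 2_000_000, "status": "complete"},
            {"id": "A2", "budget": 2_000_000, "status": "partly complete"}]
    print(allocate(acts, total_sample=3))   # {'A1': 2.0, 'A2': 1.0}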
The first allocation step is to identify the “certainty” activities. These are the activities that by
themselves would get an allocation of at least one unit based on their size. Certainty activities
are identified in two rounds, as follows.
1. The initial certainty threshold is the total measure of size divided by total sample size.
Using the total proposed budget for the 6 included BPAs with no re-weighting, this ratio
gives 1 selection for each $21M of budget. All Activities with proposed budget above
this certainty threshold are included in the sample with certainty. Also, if a grant or
sub-grant has multiple Activities with budgets above the certainty threshold, each
Activity above the threshold will be in the sample with certainty. Therefore, it is possible that
multiple Activities within a grant or sub-grant will be included in the sample.
2. Recalculate a “2nd pass” certainty threshold as the remaining measure of size divided by
the remaining sample count. All Activities with measure of size above this second pass
certainty threshold are included with certainty.
After the largest activities are pulled out in the certainty sample, the remaining sample is
allocated to the remaining Activities proportional to size. This allocation will be done across the
entire sample frame, so that a BPA with a high initial allocation due to a few very
large Activities would have a smaller final allocation after the certainty Activities are pulled out.
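A compact sketch of the two-round certainty identification follows; budgets stand in for the measure of size, and all values are invented for illustration.

    # Sketch: two-pass identification of "certainty" Activities.

    def certainty_selections(sizes, n_sample):
        """sizes: {activity_id: measure of size}; returns certainty IDs."""
        certain = set()
        for _ in range(2):                 # two rounds, per the steps above
            remaining = {k: v for k, v in sizes.items() if k not in certain}
            threshold = sum(remaining.values()) / (n_sample - len(certain))
            newly = {k for k, v in remaining.items() if v > threshold}
            if not newly:
                break
            certain |= newly
        return certain

    sizes = {"S1": 60.0, "S2": 25.0, "A1": 10.0, "A2": 5.0}
    print(certainty_selections(sizes, n_sample=4))
    # Round 1 threshold = 100/4 = 25, catching S1; round 2 threshold =
    # 40/3 = 13.3, catching S2. Result: {'S1', 'S2'}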
Figure 4-1 below illustrates how the certainty allocation falls out for the current PAGE data,
without identifying the Activities within the sub-grants. That is, in this illustration, each
state/territory Worksheet Activity Unique ID that is a sub-grant distribution process is included
as a single, very large Unique Activity. Most of these would end up as certainty selections. The
column “sum of certainty allocation” indicates the total probability proportional to size (pps)
allocation that would in principle go to the large activities. However, we allow each Activity to be
selected only once. Thus, while the total size-based allocation to the certainty activities would
be 82, there are only 41 such activities. The rest of the allocation to these activities is included
in the BPA remainder allocations.
Figure 4-1: Certainty Allocations [11]

BPA | Sum of Proposed EECBG Budget | Count of Activity Worksheet Unique ID | Sum of Sample Points Allocated | Sum of Certainty Allocation | # Certainty Activities
Local Government and Tribes Grants: | | | | |
Energy Efficiency Retrofits | $938,150,531 | 2,461 | 141.4 | 18.4 | 10
Financial Incentive Program | $236,945,115 | 325 | 35.7 | 13.2 | 6
Buildings and Facilities | $235,731,704 | 764 | 35.5 | 5.1 | 3
Lighting | $189,150,145 | 624 | 28.5 | 1.1 | 1
Onsite Renewable Technology | $159,693,424 | 447 | 24.1 | 0.0 | 0
Energy Efficiency and Conservation Strategy | $75,839,369 | 752 | 11.4 | 0.0 | 0
State Grants: | | | | |
Financial Incentive Program | $261,731,783 | 36 | 39.5 | 29.4 | 11
Energy Efficiency Retrofits | $123,638,605 | 28 | 18.6 | 7.4 | 5
Energy Efficiency and Conservation Strategy | $53,840,665 | 19 | 8.1 | 5.2 | 3
Buildings and Facilities | $30,042,381 | 17 | 4.5 | 2.5 | 2
Onsite Renewable Technology | $9,435,591 | 7 | 1.4 | 0.0 | 0
Lighting | $7,563,857 | 7 | 1.1 | 0.0 | 0
Grand Total | $2,321,763,170 | 5,487 | 350.0 | 82.4 | 41
With the identification of sub-grant Activities within the state/territory grant-level activities, these
very large activities will be broken up into smaller pieces. As a result, there can be multiple
selections of Activities within a single state/territory grant Activity. This process will provide a
better allocation of resources.
The allocation of the non-certainty sample will use a form of systematic proportional to size
sampling based on Chromy’s method. The procedure is illustrated in the figure below, for a
simplified example with a total measure of size equal to 24, a sample size of 4, and each unit
having a size of 1, 2, 3, or 4.
[11] Currently, we have only an approximate indicator for distinguishing grants to states for distribution to sub-grantees. KEMA anticipates obtaining a more definitive indicator from DOE.
Figure 4-2: Illustration of Systematic Proportional Sampling

State | Grant | Ultimate Activity ID | Size | Cum Size | Selection #
A | 1 | 11 | 3 | 3 | #2
A | 1 | 12 | 1 | 4 |
A | 2 | 21 | 1 | 5 |
B | 1 | 31 | 4 | 9 | #3
B | 2 | 32 | 3 | 12 |
B | 3 | 33 | 3 | 15 | #4
B | 4 | 34 | 1 | 16 |
C | 1 | 41 | 3 | 19 | START, #1
C | 2 | 51 | 2 | 21 |
C | 2 | 52 | 1 | 22 |
C | 3 | 61 | 1 | 23 |
C | 3 | 62 | 1 | 24 |

Total of Size: 24 | Sample size: 4 | Size increment: 6
Sort the Activities by BPA, State, and Grant number (or other factors we want to distribute the
sample over systematically).
1. Beginning at the top of the sorted list, calculate the cumulative MOS for each Activity in
the list.
2. Calculate the selection size increment as the total remainder MOS divided by the total
remainder sample size.
3. Select a random start point between 1 and the total MOS. The first selection is the first
Activity in the list whose cumulative size is greater than that random start point. This is
the Activity the random start point “lands in.”
4. The next Activity selected is the one that is “hit” by adding the selection size increment
(from Step 2) to the previous selection point. That is, the cumulative measure of size for
the next selection point is the previous selection point’s cumulative measure of size plus
the selection size increment. If this cumulative measure of size value is greater than the
total measure of size, wrap the count around to the beginning. The corresponding
selection is the first Activity on the list with cumulative measure of size greater than this.
5. Repeat step 4 until all selections are made.
This procedure ensures that an individual state or grant will be allocated selections within +/- 1
of its proportional allocation. In fact, the number of selections will be the (possibly non-integer)
expected number rounded either up or down.
For example, in the illustration, the sample size is 4, and the total measure of size is 24. The
sampling increment is therefore 24/4 = 6. The sampling rate is 1 unit for every cumulative size
measure of 6, or 1/6 of a selection per size unit. State A has a total size of 5, so it’s expected
number of selections is 5/6. In the example, state A has 1 selection. State A cannot have more
than 1 selection because the increment of 6 units cannot fit within State A. Once State A has
been hit once, the next increment will go beyond State A. State A could end up with 0
selections, if the selection increments hit just before and just after it. Therefore, the possible
number of selections for State A is 0 or 1.
State C on the other hand has a total size of 8. In the illustration it was selected once.
However, with an increment of 6, State C could have been hit twice. State C has to be hit at
least once, because its size is bigger than the selection increment. The expected number of
selections for State C is 8/6 = 1.3. The possible number of selections for State C is 1 or 2.
To control the sample size by BPA exactly, we will apply this method separately for each BPA.
Within BPA, we can sort by Metric Activity, State, Grant, and Grantee type. We plan to sort first
by State and Grant, so that each of these will have allocations within +/- 1 of their expected
number within the BPA. However, there may randomly be over- or under-allocation to Metric
Activity or Grantee Type within the BPA.
4.5 Oversampling
The sample drawn will be greater than the target of 350 completes, to allow for cases where it is
not possible to evaluate a selected Activity due to lack of available or cooperative respondents,
or other data limitations. We will work with DOE to make every practical effort to secure a
response for the primary selections before moving to a substitute. We will provide explicit rules
for how substitutes will be chosen.
5. Review of EECBG Databases
5.1 Creation of Review Packages
Each sampled grant Activity will have several background documents associated with it,
including:
Grant application
Quarterly reports, with data incorporated into PAGE
Reports to OMB, with data incorporated in OMB database
Status reports to State and Regional offices, with data incorporated into GRASS
A set of Activities will be assigned to individual reviewers, with associated hard copy reports
plus links to the various data sources for each grant. The combined information constitutes a
grant review package.
5.2 Review Procedure
The initial review will occur prior to the administration of the surveys. This phase will proceed as
follows:
1. Review all material associated with the selected Activity to determine what project(s)
were performed, what types of buildings or facilities were involved, and other aspects of
the Activity.
2. Determine what portion of the grant the selected Activity represents, and obtain a
general description of the other grant components.
3. Prepare a Case Study description of the Activity that will serve as the background “story”
of the Activity.
4. Create an individual database of information concerning the specific Activity by
extracting key data elements from PAGE, GRASS, OMB and any other sources of data
on specific actions taken to save energy in buildings or facilities. Carefully provide
source notations as to the origination of each data point for future verification.[12] Note
that a template/format will be developed for the Activity-Level Evaluation Data File for
each BPA to ensure specificity for evaluation of that program type, comprehensiveness
and consistency across Activities within each BPA.
5. Populate the survey instruments as follows using information collected from the reviews
of the various EECBG databases.
a. Survey 1: DOE Program Officers, Regional and State Coordinators:
i. Name and contact info for the respondent
ii. Survey appointment date and time
iii. Interviewer name
iv. Grant title and recipient
v. Activity BPA category
vi. Activity description
vii. Grant amount
viii. Activity amount or budget
ix. Status of the Activity and Grant (descriptive and % complete)
x. Notes concerning any special issues or questions that have arisen from
prior research and review of the EECBG databases
b. Survey 2: Grant Activity Manager Survey
i. Items i-ix listed in Survey 1
ii. Insert data from PAGE into each verification question in survey
instrument
iii. For those questions that cannot be populated from PAGE, indicate
whether the value is missing or of questionable quality
c. Survey 3: Performance Factor Survey
i. Items i-ix listed in Survey 1
[12] For example, insert a comment in each cell using the Excel comment feature, or create a separate field for source notes within the database.
5.3 Post-Survey Review
After the new data are collected from the three surveys, the person conducting the review will enter the
survey data into the Activity-Level Evaluation Data File and update all Activity information. This
phase will proceed with the following three final steps required before analysis:
1. Review for completeness and potential accuracy of information for assessing energy
savings and other outcomes for the Activity.
2. Identify any questions or inconsistencies or remaining gaps in the available Activity Data.
3. If such gaps or questions remain, schedule a follow-up call with the respondent to fill
gaps and address issues (See Section 6.0).
6. Survey Implementation
6.1 Population of Surveys with Data from Review of EECBG Databases
The review of the EECBG databases will culminate in having each assigned reviewer populate
the appropriate Grant Activity Manager Survey (Survey #2A or #2B) with data concerning the
specific Activity sampled. The purpose of this is to create a customized survey for each
sampled Activity so that information can be quickly verified or collected while on the telephone.
The other two surveys – Surveys 1 and 3 – will similarly be prepared with the contact person’s
name, phone number, grant amount, BPA and Activity description so that the telephone
interviewer can refer back to the specific projects undertaken during the call.
Figure 6-1: Survey Instrument Summary

Survey #1 – Target: DOE Project Officers; DOE Technical Assistance State and Regional Coordinators. Purpose: Gain understanding of grant details, project activity, and performance; identify the Grantee Contact for further surveys.

Survey #2 – Target: Grant Managers (Program Activity Level). Purpose: Obtain Program area and Activity level detail.

Survey #3 – Target: Grant Managers (Program Level). Purpose: Identify factors with the potential to influence performance.

6.2 Survey Training
KEMA will conduct training on the survey instruments with all team members who are
conducting the calls. The survey instruments will be tested for timing and logic of the skip
patterns. A survey dictionary will be developed to assist the interviewers with technical terms
and provide definitions for respondents who may have questions.
6.3 Conduct Survey #1 – DOE Program Officers Interviews
Senior KEMA team members will conduct the interviews with DOE Program Officers and
State and Regional Coordinators. We will employ professionals who have experience
in conducting process evaluation interviews and have the technical capability to probe issues
that may arise during the discussion. A key factor in these interviews will be the identification of
the best person at the Activity Level of the grant selected to participate in the Grant Activity
Manager Survey.
To schedule the calls, KEMA may request a cover letter email be sent by DOE introducing
KEMA and the purpose of the calls so that full cooperation can be encouraged. We will
schedule the calls at the respondent’s convenience in anticipation of a 30 to 40 minute contact.
6.4 Conduct Survey #2 – Local Grantee Grant Activity Manager Survey
These surveys will be performed by trained CATI telephone interview staff experienced in
conducting energy efficiency program related studies. Each will undergo extensive training on
the instrument so that they fully understand the skip patterns and response choices involved
with each question. The survey instrument is very detailed with response choices, which will
enable non-engineers to conduct this phase of the research in an efficient and cost-effective
manner.
Once the surveys are concluded, the individual engineering staff (desk reviewer) will review
each of the results for comprehensiveness and adequacy of data for the purpose of estimating
energy savings.
6.5 Conduct Survey #3 – Performance Factors Survey
The Performance Factors Survey will be conducted by trained CATI staff using a programmed
instrument. At the same time, there will be ample opportunities for open-ended questions so
that any unanticipated factors and/or issues can be captured through the conversations.
6.6 Follow-Up Calls
Follow-up calls will be conducted by the engineers performing the desk reviews, as these
individuals possess the engineering knowledge of the technologies and building science that will
enable them to probe for any gaps in the data. These follow-up calls will be made when needed
to fill in the necessary information, and will focus on detailed information unique to the
specific Activity.
6.7 Prepare Survey Results Database
Responses from Survey #1 will be entered into a spreadsheet database for analysis. This
approach will be used due to the more conversational nature of these calls, their shorter
duration and the small sample size involved (55 DOE Program Officers and 17 regional and
state support staff).
Responses from Survey #2 will be captured electronically during the telephone interview, with
no manual data entry involved. We anticipate using a Computer Aided Telephone Interview
(CATI) system that will automatically store responses to questions into a database for later
analysis. Due to the complexity of the survey instrument and the potential need to cycle back
to sections for addressing multiple buildings and/or multiple measures, a CATI system is most
appropriate. This system also provides for frequent status reporting on the survey disposition
so that KEMA can include weekly reports on survey status while calls are in the field. The desk
reviewers responsible for evaluating each Activity will download the relevant data from the
survey results into their Activity-Level Data File.
Responses from Survey #3 will also be captured via the CATI system, so that the data can be
appended to responses from each of the 350 sampled grants. As above, the desk reviewers
responsible for each Activity will download the relevant data from the survey results into their
Activity-Level Data File.
7. Estimation of Energy Savings
This section provides guidelines for estimating the energy impacts for the program activities to
be evaluated. The energy impacts referred to in this section correspond in concept to “gross
savings” as that term is commonly used in evaluation of rate payer-funded energy efficiency
programs. Evaluation team members charged with managing each of the sample program
Activity evaluations (the Lead Evaluators) will prepare detailed evaluation plans that take into
account each sampled program Activity’s actual operations, scale, organization, roster of
services provided, and level of documentation.
7.1 Introduction
The evaluation of the EECBG program will be based upon the integration of data from each key
data source, as described in Chapter 2. While the DOE databases, particularly PAGE, may
provide information and data for the specific Activity, it is the GAMS surveys, together with the
follow-up calls, that will provide the data required for the central part of the evaluation. The
review of Activity data must be conducted by individuals with training in building and technology
sciences, who have the capacity to determine the engineering logic of the information provided,
and whether adequate data exist to be able to develop a reasonable estimate of energy
savings.
To that aim, the evaluation team has assigned a team of engineers with experience in impact
evaluation to research a portfolio or set of Grant Activities from “cradle to grave”: from the
application paperwork through to the grant close-out information being collected by DOE (see
discussion of Grant Close-Out Tagging Process in Section 3.0). Having each Activity
researched by one person will provide continuity through the study tasks and will result in a
higher quality of insights based on the accumulated knowledge of that individual concerning that
Activity. The reporting will consist of a Case Study description of the grant and Activity, along
with results of the data collection and final energy savings results – all performed by the same
individual closest to the information.
Quality control will be exercised by cross-reviewing within the team by equally (if not more)
experienced engineering and impact evaluation Subject Matter Experts. Each desk reviewer
will be responsible for defending their information, their conclusions and their recommendations
concerning each of the Activities in their portfolio.
7.2 Groupings of Programs for Energy Impact Assessment Planning
The energy impact analysis will focus on six BPAs:
Energy Efficiency Retrofits
Financial Incentives Programs
Building and Facilities
Lighting
On-site Renewable Technology
Energy Efficiency and Conservation Strategy
The six BPAs can be grouped by type of Activity performed in the program, which will permit the
application of common and consistent impact analyses on like measures. Specifically, for the
purposes of energy impact analysis, the six BPAs can be categorized into three basic groups of
program activities.
1. Building Retrofit and Equipment Replacement. For this group, the basic energy
savings mechanism involves the implementation of energy-savings capital projects or
the installation of energy-efficient equipment in existing residential, commercial, and
industrial facilities. Estimation of energy savings generally requires the following steps:
a. Review and validation of program records to ensure that they capture and
characterize accurately the capital improvements or efficient equipment
installations supported by the program. The data sources for this analysis will
include the PAGE database, DOE’s Golden database, and GRASS.
b. Verification of the measure installation and operation. The data collected through
the grant Activity survey will provide verification of the installation and operation
of measures installed in the Activity.
c. Expansion of sample findings to the population of projects, usually through the
application of ratio estimation (a sketch follows below).
BPAs included: Energy Efficiency Retrofits, Financial Incentives Programs, Buildings and
Facilities, and Lighting
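Step (c) can be sketched as a simple realization-rate calculation, one common form of ratio estimator; the reported and evaluated savings values below are invented for illustration.

    # Sketch: ratio (realization-rate) expansion of sample findings.
    # "reported" = program-record savings; "evaluated" = verified savings
    # from the desk review and survey data. All numbers are invented.

    sample = [            # (reported kWh, evaluated kWh) per sampled Activity
        (100_000, 90_000),
        (250_000, 200_000),
        (50_000, 55_000),
    ]
    realization_rate = (sum(e for _, e in sample) /
                        sum(r for r, _ in sample))

    population_reported_kwh = 4_000_000   # total reported savings in the frame
    estimate = realization_rate * population_reported_kwh
    print(round(realization_rate, 4), round(estimate))   # 0.8625 3450000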
2. On-Site Renewable Technology. For this group, the energy savings mechanism
involves the production and delivery of energy using renewable technologies that would
otherwise have been produced by conventional fuels including: petroleum products,
natural gas, nuclear power, or coal. The types of technologies in this BPA differ fundamentally
from the retrofit and replacement of equipment and should be analyzed as a
separate group. However, while the type of data collected for on-site renewable
technologies (e.g., capacity, generation patterns, etc.) is different, the estimation of
energy savings will use an approach similar to the analysis of savings from building
retrofits:
a. Review and validation of program records to ensure that they capture and
characterize accurately the renewable energy equipment installations made with
program support and verification of installation and operation.
b. Expansion of sample findings to the population of projects, usually through the
application of ratio estimation, with appropriate segmentation by renewable
energy system type and size.
BPA included: On-site Renewable Technology
3. Energy Efficiency and Conservation Strategy. The Energy Efficiency and
Conservation Strategy BPA is distinct from the other BPAs offered in the EECBG
program. As a condition for participation in EECBG, potential grantees/sub-grantees
were required to develop an Energy Efficiency and Conservation Strategy plan for their
state/territory, municipality or tribe. The objective of the plan was to ensure that
recipients developed a forward-looking framework to identify and capture energy saving
opportunities and associated benefits such as job growth and environmental benefits.
Some strategies developed in this BPA do not necessarily translate to direct energy
savings but rather identify other EECBG BPAs or other energy efficiency programs
(e.g., State Energy Programs, utility-sponsored energy efficiency programs, etc.) that
should be pursued to achieve direct energy savings. Therefore, the energy impact
analysis approach for this BPA differs from the other two program groupings described
above. Specifically, the diversity of Activity types in this BPA will require a two-pronged
approach.
Direct energy savings. For activities in this BPA in which energy efficiency measures
were installed (direct energy savings), the energy impact analysis will follow the
process described above for the Building Retrofit and Equipment Replacement
grantees/sub-grantees or the On-site Renewable Technology category, as
appropriate.
No direct energy savings. Activities in this BPA with no direct energy savings that
are related to program administration, such as hiring an energy efficiency coordinator
or costs for administering an energy efficiency program, will be excluded
from the energy impact analysis. However, training and education programs in
this BPA with no direct energy savings will be included in the analysis.
BPA included: Energy Efficiency and Conservation Strategy
7.3 Evaluation Plans: Building Retrofit and Equipment Replacement
7.3.1 Introduction
Based on the review of the EECBG PAGE database and information gained from work on state-level EECBG Program Activity evaluations, we have determined that many of the program
activities in the Building Retrofit and Equipment Replacement group are retrofit projects
involving the early replacement of functioning equipment and building systems with energy
efficient models.
Lifetime energy savings is one of the key evaluation metrics for this evaluation. In order to
estimate savings from a retrofit project fairly and accurately, it is necessary to determine or to
provide clear and reasonable assumptions regarding how long the facility owner would have
kept the pre-existing equipment in place in the absence of program assistance to replace it.
The example depicted in Figure 7-1 illustrates the importance of this methodological issue. The
solid horizontal lines show the annual energy consumption for a large, durable piece of
equipment, such as a chiller, at three levels of efficiency: the equipment in place, the current
standard or baseline efficiency for new equipment, and the most efficient equipment available.
Assume the program participant installs a new chiller with the highest available efficiency, and that the program induced the participant to do so four years before the replacement would have been made in the absence of the program. We refer to the period between the program-induced improvements and the
(hypothetical) date when they would otherwise have occurred as the “acceleration period.”
During the acceleration period, energy savings would be represented by the shaded area
labeled “Energy Savings during the Acceleration Period”. After year four, the relevant efficiency
improvement is represented by the distance between the “Current Baseline” and “Efficient”
annual consumption levels. So, from year four to the end of the equipment’s useful life, the total
savings are represented by the shaded area labeled “Energy Savings after the Acceleration
Period.” If we had simply projected the savings during the acceleration period to the entire
useful life, lifetime energy savings would be much greater, as represented by the rectangle
bounded by points a, b, c, and d.
Figure 7-1: Representation of Energy Savings from Retrofit
[Figure: annual energy consumption at the existing, current-baseline, and efficient levels; shaded areas show savings during and after the acceleration period, with rectangle a-b-c-d marking the naive lifetime projection.]
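To make the arithmetic behind Figure 7-1 concrete, the following minimal sketch (with hypothetical consumption values, not program data) contrasts the two-period lifetime savings with the naive projection represented by rectangle a-b-c-d:

```python
# Hypothetical annual consumption (MWh/yr) for a chiller at the three
# efficiency levels shown in Figure 7-1; all values are illustrative.
existing_use = 500.0    # equipment in place before the retrofit
baseline_use = 420.0    # current-standard (code baseline) new equipment
efficient_use = 360.0   # most efficient equipment available

acceleration_years = 4  # program advanced the replacement by four years
useful_life_years = 20  # rated useful life of the new equipment

# Savings during the acceleration period: efficient unit vs. the
# pre-existing equipment that would otherwise have stayed in place.
accel_savings = (existing_use - efficient_use) * acceleration_years

# Savings after the acceleration period: efficient unit vs. the current
# baseline, since baseline equipment would have been installed by then.
post_savings = (baseline_use - efficient_use) * (useful_life_years - acceleration_years)

lifetime_savings = accel_savings + post_savings  # 560 + 960 = 1,520 MWh

# Naive projection (rectangle a-b-c-d): acceleration-period savings rate
# applied over the full useful life, which overstates lifetime savings.
naive_savings = (existing_use - efficient_use) * useful_life_years  # 2,800 MWh
```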
In assessing the length of the acceleration and post-installation periods for individual projects or
groups of projects, we will take the following into consideration:
Studies of persistence of measures in the field undertaken for public benefits program
sponsors
Databases of measure performance such as California's Database for Energy Efficient Resources (DEER) and Technical Reference Manuals developed for other program sponsors
Knowledge of the facility management and investment practices of key owner segments.
For example, in our own practice we often find that government agencies, operating
under budget constraints, retain major heating, mechanical, lighting, and control systems
in place well beyond their rated useful lives. Conversely, in retail and office space, lighting systems are replaced frequently with changes in occupancy, and mechanical systems are adjusted to accommodate occupancy needs.
Surveys and follow-up telephone interviews with grantee and sub-grantee project managers will
be used to assess the extent to which program assistance accelerated replacement of the
equipment in question.
7.3.2 Development of the Savings Calculation Tool (SCT)
We will undertake evaluations of 350 program activities. In order to ensure consistency of
evaluation methods across each Activity, transparency of procedures, replicability of results,
and an auditable trail for quality control, we will develop a Savings Calculation Tool (SCT) that
will be used by all Lead Evaluators.
For all evaluations of activities included in the Building Retrofit and Equipment Replacement group, it will be necessary to develop engineering-based (TRM-type) estimates of savings for the diverse
sample of activities. The quality of these estimates will be equivalent to the reliability provided
by ex ante projections of savings typical of energy efficiency Technical Reference Manuals such
as the New York TRM. The program activities in the Building Retrofit and Equipment
Replacement category support a broad range of measures in the full spectrum of residential and
non-residential end-uses. Moreover, they operate in a wide variety of climate zones and in
states characterized by large variations in baseline efficiency, as shaped by levels of code
adoption and customary building practice. We know from preliminary work that the PAGE
tracking database for EECBG programs may vary in terms of content and level of detail for
some activities. For example, PAGE may contain information on square footage of the space
associated with the measures implemented in the Activity but may have no other measures of
scale, such as counts of units installed. Other entries contain information on project cost, but no
other measures of size. Our engineering calculations will need to make the best use of survey
data that verify and supplement the PAGE data, as well as follow-up interviews with grantee and sub-grantee Activity project managers. We will develop procedures and tools to maintain as much
consistency and transparency as possible in the savings analysis.
As mentioned above, we will develop for ORNL/DOE an SCT to meet these needs. The SCT
will be developed in Microsoft Access or Excel and a separate copy populated with local data for
each Activity evaluation. We will, to the extent possible, leverage the work undertaken to
develop the SCT for the National Evaluation of the State Energy Program (SEP) currently
being performed by KEMA. We will exploit opportunities to mirror the structure of the SEP SCT
for activities that are similar between the two programs. For activities that are unique to the
EECBG program, we will expand the SCT to incorporate the modeling parameters required to
conduct the energy savings calculations.
We anticipate that the EECBG SCT will consist of the following components:
o Savings Algorithm Library. This portion of the tool will contain savings calculation
algorithms for the full range of common energy savings measures in the building
segments. For weather-sensitive measures such as HVAC improvements, the
algorithms will include formulae and procedures for taking local weather conditions
into account, including specification of the local weather data required. These
algorithms will be based on similar work contained in Technical Reference Manuals,
as well as our own engineering experience. The sources of all algorithms will be fully
documented in this portion of the tool.
o Input parameter assumption library. This portion of the tool will contain input
parameter assumptions used in the algorithms. For some, these will be engineering
constants, such as the conversion of motor horsepower to kW or efficiency curves
used to estimate savings from VFDs. Others, such as coincidence factors, hours of
use for lighting, and heating and cooling degree days will need to be localized to
regions, states, or climate zones as appropriate. Finally, this library will contain the
“acceleration period” matrices discussed above.
o Input parameter estimates. This portion of the tool will contain the input parameter
estimates actually used in the evaluation of a given Activity. These will be estimated
through verification activities (e.g., telephone surveys with the grant/sub-grantee
project managers) of the Activity together with the input parameter assumption
library.
o Tracking database file. The tracking database will be copied, moved, or data
entered into a flat file in the SCT for use in developing ex ante estimates of savings
at the individual Activity level of aggregation.
o Ex ante savings file. This portion of the tool will contain the results of the ex ante
savings calculations at the lowest level of aggregation supported by the input data.
From these results, we should be able to calculate statistics such as savings per
project or per unit of various measures that can be used to test the plausibility of
estimates and to assess the accuracy of the input data.
o Verification data file. This portion of the tool will contain the cleaned raw data from
the data collection on the verification sample that was done by telephone.
o Verified savings file. This file will contain the results of the estimations of verified
savings for each sample site. To the extent possible the calculations of verified
savings will be stored with the individual site records on this file. For instances in
which the calculations are too complex or customized, this file will contain references
to work papers and free-standing spreadsheet files.
o Ratio estimation and sample expansion file. Where ratio estimation is used, this
sheet will contain the output of calculations which KEMA generally implements in a
statistical package such as SAS. This sheet will also contain the calculations by
which the sample data are expanded to the population.
o Energy savings summary file. This sheet will contain the principal results of the
savings analysis, including average annual energy savings, lifetime energy savings,
and average peak demand reductions. This sheet may also contain areas for
calculations that are driven by energy savings estimates, such as energy cost
savings and emission reductions.
o Cost benefit inputs file. This file will contain the inputs needed for cost benefit
analysis and other economic characterizations of the program, including program
expenditures, developed in consultation with ORNL/DOE.
Once the tool is created, the Lead Evaluator for the Activity evaluation will be responsible for
populating it. We will store the current versions of each tool on a central server where senior
evaluation managers can access them for quality control checks and to verify progress.
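For illustration, a TRM-style entry in the Savings Algorithm Library might look like the sketch below; the function, parameter names, and example values are hypothetical and are not drawn from any particular TRM or from the SCT specification:

```python
def lighting_retrofit_kwh(units, watts_base, watts_efficient,
                          annual_hours, coincidence_factor=1.0):
    """TRM-style annual kWh savings for a lighting retrofit.

    units              -- verified count of fixtures replaced
    watts_base         -- connected load per baseline fixture (W)
    watts_efficient    -- connected load per efficient fixture (W)
    annual_hours       -- localized hours of use, drawn from the input
                          parameter assumption library
    coincidence_factor -- portion of the connected-load reduction realized
    """
    delta_kw = units * (watts_base - watts_efficient) / 1000.0
    return delta_kw * annual_hours * coincidence_factor

# Example: 200 fixtures reduced from 172 W to 112 W on an office schedule.
annual_kwh = lighting_retrofit_kwh(200, 172, 112, annual_hours=3500)  # 42,000 kWh
```

Localized parameters such as hours of use would be looked up from the input parameter assumption library rather than hard-coded, so that the same algorithm can be applied consistently across climate zones and building types.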
7.3.3 Energy Impacts Assessment Approach
The impact analysis will consist of the following sub-tasks:
Assessment of evaluability
Verification data collection and analysis
Calculation of energy savings estimates
Expansion of sample results to the population of participants.
7.3.3.1 Assessment of Evaluability
The objective of this task is to determine whether it will be possible to evaluate the sampled
Activity. The Lead Evaluator for the Activity will be responsible for collecting information on the
criteria listed below and for submitting to the KEMA Project Manager an evaluability assessment
within two weeks of initiation of the Activity analysis. The criteria to be applied in assessing
evaluability of sampled activities in this group will include the following:
Progress in Activity implementation. In order to be considered for evaluation, the
program Activity needs to meet the following implementation milestones:
o Received and approved applications and completed contract agreements for
loans or grants (or other applicable incentives) from eligible participants
o The Activity is currently active, and is not at risk of cancellation or movement of
significant funding to a different BPA
o Documentation and reporting data are sufficient to conduct the evaluation
Quality and availability of program records. At a minimum, evaluation will require an
indicator of the kinds of services and/or incentives received. The Lead Evaluator will
make an assessment as to whether:
o All or nearly all of the Activity data are included in PAGE, other EECBG-related
databases or available on paper in the grantee or sub-grantee file. Such
information would include types of measures installed, end-uses addressed,
quantity, efficiency rating, and installed capacity of equipment installed, project
costs, and savings estimates developed by other organizations.
o The Lead Evaluator will assess the quality and completeness of the data fields to
determine if they provide what is required by the Savings Calculation Tool (SCT)
to develop consistent ex ante estimates of savings for each Activity. For larger
and more complex projects, review of the project files may be required to ensure
that ex ante estimates are reasonable. We anticipate needing to supplement
tracking system data with information gained from paper files and questioning of
program staff in some cases.
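The criteria above lend themselves to a simple structured checklist. The sketch below (field names are our own illustration, not PAGE fields or a prescribed form) shows one way a Lead Evaluator's findings could be recorded and screened:

```python
from dataclasses import dataclass

@dataclass
class EvaluabilityAssessment:
    """Structured record of the evaluability criteria for a sampled Activity."""
    applications_approved: bool     # agreements completed for loans/grants
    activity_active: bool           # not at risk of cancellation or BPA shift
    documentation_sufficient: bool  # reporting data adequate for evaluation
    records_complete: bool          # measures, quantities, ratings, costs

    def retain_in_sample(self) -> bool:
        # Recommend retention only if every criterion is satisfied.
        return all([self.applications_approved, self.activity_active,
                    self.documentation_sufficient, self.records_complete])
```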
Deliverables. The deliverable for this task will be a memorandum summarizing the Lead
Evaluator’s findings in regard to the criteria listed above and a recommendation regarding the
retention of the Activity in the evaluation sample.
7.3.3.2 Measurement and Verification Data Collection and Analysis
The objective of this task is to develop verified estimates of energy savings for all the Building
Retrofit and Equipment Replacement activities selected for the evaluation. The measurement
and verification of savings will be accomplished through telephone verification interviews.
Telephone verification interviews with grant/sub-grantee project managers will validate or
update information on the type, quantity, and capacity of equipment measures installed with
program support.
After completing the review of the Activity data and documentation and the verification data, the
next step will be to calculate the energy savings. The key components of this task include:
Determine the appropriate baseline conditions
o Normal pre-EECBG baseline: Energy savings for retrofits are based on either the pre-existing conditions or on a minimally code-compliant replacement that
has not been influenced by EECBG.
o Dual baseline: In the case where the existing equipment was not ready for
replacement but was replaced to improve energy efficiency, the remaining useful
life of the equipment is considered. The first baseline, the early replacement
baseline, uses the energy consumption of the preexisting equipment for the
remaining useful life. The second baseline, the normal replacement pre-EECBG
baseline, applies after the remaining useful life of the equipment until the
estimated end of the measure life.
Perform engineering calculations to determine the gross savings achieved. The gross site savings will be calculated as the difference between the measure-treated energy usage and the appropriate pre-EECBG baseline usage. The engineer
combines data from the following sources to estimate savings: participant survey
interviews, including hours of operation, seasonal patterns of use, control schemes;
equipment specifications and invoices; engineering best practices and reference data.
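A minimal sketch of the dual-baseline logic described above (all names and values are illustrative only, not program data):

```python
def dual_baseline_savings(pre_use, code_baseline_use, measure_use,
                          remaining_useful_life, measure_life):
    """Year-by-year gross savings under the dual baseline.

    Within the pre-existing equipment's remaining useful life, the early
    replacement baseline (pre-existing consumption) applies; afterwards the
    normal replacement (code baseline) consumption applies.
    """
    savings = []
    for year in range(1, measure_life + 1):
        baseline = pre_use if year <= remaining_useful_life else code_baseline_use
        savings.append(baseline - measure_use)
    return savings

# Example: 5 years of remaining useful life, 15-year measure life (MWh/yr).
annual = dual_baseline_savings(120.0, 100.0, 85.0,
                               remaining_useful_life=5, measure_life=15)
# First five years save 35 MWh/yr; the remaining ten save 15 MWh/yr.
```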
Deliverables. The deliverables for this task will be as follows:
Verification data file populated with data collected through telephone interviews for each
sample project.
Verified savings file populated with the verified savings estimate for each sample site.
This file will also contain references to algorithms and assumptions used from the
libraries included in the Savings Calculation Tool, and to external spreadsheets that
contain the savings calculations for more complex measures.
Work papers consisting of savings calculation spreadsheets and scans of paper records,
such as manufacturers’ cut sheets used in developing savings estimates for complex
measures.
7.3.3.3 Expansion of Sample Savings Estimates
The final step will be to expand the findings of verified savings for the sample of Building Retrofit
and Equipment Replacement activities to the BPA levels. The data analyses performed for all
sampled Activities will produce a set of savings estimates that are adjusted to reflect the actual
quantity, efficiency features, operating environment, and operating patterns of the measures
installed. We will use ratio estimation techniques to combine these Activity-specific estimates of savings per dollar with information on expenditures for the entire population into an estimate of adjusted gross savings for each of the six BPAs included in the EECBG evaluation.
The calculation of the adjustment factors for preliminary savings estimates uses appropriate
weights corresponding to the sampling rate within each stratum. The extrapolation of Activity
savings to BPA level will be based upon the following calculation:
BPA Level Savings = Σ_i (sampling weight for stratum i) × Σ_j (savings for project j in stratum i / grant expenditure for project j)
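A minimal sketch of one common form of this calculation, a per-stratum ratio estimator expanded by population expenditure (sample data and field names are hypothetical; as noted above, the production calculation would be implemented in a statistical package such as SAS):

```python
# Stratified ratio estimation with hypothetical sample data.
sample = {
    # stratum: list of (verified savings in MWh, grant expenditure in $)
    "small": [(40.0, 25_000), (55.0, 30_000), (38.0, 22_000)],
    "large": [(400.0, 210_000), (520.0, 260_000)],
}
population_expenditure = {"small": 1_200_000, "large": 3_900_000}

bpa_savings = 0.0
for stratum, projects in sample.items():
    sample_savings = sum(s for s, _ in projects)
    sample_spend = sum(e for _, e in projects)
    ratio = sample_savings / sample_spend  # verified MWh per grant dollar
    # Expand the stratum ratio to the stratum's population expenditure.
    bpa_savings += ratio * population_expenditure[stratum]
```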
Deliverables. The deliverables for this task include:
Ratio estimation and sample expansion file populated with the results of the sample
expansion calculations, which will include total energy savings for the Activity and for the
BPA. These findings may be used to help refine savings parameters used in the
Savings Calculation Tool.
Energy savings summary file. This sheet will contain the principal results of the savings
analysis, including average annual energy savings, lifetime energy savings, and average
peak demand reductions.
7.4 On-site Renewable Technology Program
7.4.1 Introduction
The On-site Renewable Technology Program focuses on the development of customer-sited
equipment. The energy impact assessments will be based on estimation of renewable energy
generation and capacity for a sample of installations, and expansion of those estimates to the
relevant population of installations using various statistical approaches. The savings for all
selected activities will be verified via remote methods, including telephone interviews with
project principals and review of project specifications and energy production records, to the
extent those are available.
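Where metered production records are unavailable, annual generation can be approximated from verified capacity and a technology- and site-appropriate capacity factor. The sketch below is a minimal illustration with hypothetical values, not the evaluation's prescribed method:

```python
HOURS_PER_YEAR = 8760

def annual_generation_mwh(capacity_kw, capacity_factor):
    """Expected annual generation from a customer-sited system.

    capacity_kw     -- verified nameplate capacity (kW)
    capacity_factor -- fraction of the year at full output; localized by
                       technology and site (the value below is hypothetical)
    """
    return capacity_kw * capacity_factor * HOURS_PER_YEAR / 1000.0

# Example: a 50 kW rooftop PV array at an assumed 16% capacity factor.
mwh = annual_generation_mwh(50, 0.16)  # -> 70.08 MWh/yr
```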
7.4.2 Assessment of Evaluability
The objective of this task is to determine whether it will be possible to evaluate the sampled
Activity. The Lead Evaluator for the Activity will be responsible for collecting information on the
criteria listed below and for submitting to the KEMA Project Manager an evaluability assessment
within two weeks of initiation of the Activity analysis. The criteria to be applied in assessing
evaluability of sampled activities in this group will include the following:
Progress in Activity implementation. In order to be considered for evaluation, the
program Activity needs to meet the following implementation milestones:
o Received and approved applications, and completed contract agreements for
loans or grants (or other applicable incentives) from eligible participants. The
Activity is currently active, and is not at risk of cancellation or movement of
significant funding to a different BPA
o Documentation and reporting data are sufficient to conduct the evaluation
Quality and availability of program records. At a minimum, evaluation will require some
indicator of the kinds of services and/or incentives received. The Lead Evaluator will
make an assessment as to whether:
o All or nearly all of the Activity data are included in PAGE, other EECBG-related
databases or available on paper in the grantee or sub-grantee file. Such
information would include types of measures installed, end-uses addressed,
quantity, efficiency rating, and installed capacity of equipment installed, project
costs, and savings estimates developed by other organizations.
o The Lead Evaluator will assess the quality and completeness of the data fields to
determine if they provide what is required by the Savings Calculation Tool (SCT)
to develop consistent ex ante estimates of savings for each Activity. For larger
and more complex projects, review of the project files may be required to ensure
that ex ante estimates are reasonable. We anticipate needing to supplement
tracking system data with information gained from paper files and questioning of
program staff in many cases.
7.4.3 Verification Data Collection and Analysis
The objective of this task is to develop verified estimates of renewable energy generated for on-site renewable activities selected for the evaluation. Verification information will be collected
only through remote activities, including file review and interviews with project owners and
operators.
7.4.4 Expansion of Sample Savings Estimates to the Population of Activities
The sample expansion procedures to be used in the evaluation of the activities in this group are
the same as those described above in Section 7.3.3.3 for the Building Retrofit and Equipment
Replacement group.
7.5 Energy Efficiency and Conservation Strategy
7.5.1 Assessment of Evaluability
The criteria to be applied in assessing evaluability of the activities in the Energy Efficiency and
Conservation Strategy BPA will include the following:
Progress in Activity implementation. In order to be considered for evaluation, the
program Activity needs to have met the following implementation milestones:
o The development of the energy efficiency and conservation strategy is either
currently active or completed, and is not at risk of cancellation or movement of
significant funding to a different BPA.
o Documentation and reporting data are sufficient to conduct the evaluation.
Quality and availability of program records. At a minimum, evaluation will require an indicator of the kinds of activities resulting from the strategy (e.g., recommended energy efficiency activities to pursue, creation of an energy efficiency office/manager). The Lead Evaluator will make an assessment as to whether:
o A sufficient level of Activity data are included in PAGE, other EECBG-related databases, or available on paper in the grantee or sub-grantee file to allow
savings to be calculated. Such information could include types of measures
installed, end-uses addressed, quantity, efficiency rating, and installed capacity
of equipment installed, project costs, and savings estimates developed by other
organizations.
o The Lead Evaluator will assess the quality and completeness of the data fields to determine whether they provide what is required by the Savings Calculation Tool (SCT) to develop consistent ex ante estimates of savings for each Activity where energy efficiency activities are implemented as part of this BPA. For larger and more complex projects, review of the project files may be required to ensure that ex ante estimates are reasonable. We anticipate needing to supplement tracking system data with information gained from paper files and questioning of program staff in some cases.
7.5.2 Estimation of Energy Impacts
For the selected activities where energy efficiency measures or on-site renewable technologies are installed, the methodology for calculating energy savings will follow the corresponding methodologies described above in Sections 7.3 and 7.4.
7.5.3 Expansion of Sample Savings Estimates to the Population of Activities
The sample expansion procedures to be used in the evaluation of the activities in this group are
the same as those described for the Building Retrofit and Equipment Replacement group.
8. Attribution Approach
EECBG projects may have been heavily influenced by existing SEP initiatives and by governmental, utility, or other local funding sources. Consistently decoupling the net effects of EECBG from those of SEP and other funding sources and influences will be an important methodological consideration for this evaluation effort. Standard practice for assigning attribution involves
consideration of three factors:
The proportion of funding from EECBG (e.g., does EECBG comprise the majority source
of funds used for the Activity?)
The timing of the Activity (e.g., was the Activity already planned, but accelerated
because of EECBG?)
The extent of the Activity (e.g., did the EECBG funding make it possible to “do more” or to make even more efficient choices than had originally been planned?)
These factors are typical of free-ridership and net-to-gross analysis, but are applied here to
qualitatively assess attribution since the scope of the project does not involve consideration of a
control group or other standard practices more commonly used for a utility program evaluation.
8.1 Source of Attribution Data
The survey instruments will query grant recipients in the sample regarding the amounts, proportions of total funding, and sources of non-EECBG funding. Then, KEMA plans to
deploy the following three-step process to meet this challenge in the context of evaluations of
individual projects.
1. Definition. The first step of determining attribution of savings impacts to EECBG will be to define qualitatively, based on the information available in the PAGE data, the overall sources of funding and other influences on each sampled Activity.
2. Validation. The second step will be to validate the components of influence, and if used,
the program theory and logic models, with results of the structured interviews with Grant
and Activity Managers and DOE Program Officers.
3. Quantification. The final step in the process will be to quantify the portion of measured
outcomes attributable to subject Activity using techniques that have long been
associated with energy program evaluation, consisting of an analysis of self-reports of
program effects from the sample of the target population.
8.2 Attribution Analysis
The “bottom line” of attributable energy savings will, of necessity, be developed on a slightly
more subjective basis than standard utility program evaluation, where concepts such as free-ridership and spillover are better measured. In this case, PAGE data and the impressions and
opinions of respondents at three levels – DOE Program Officers, the Grant Manager and the
Activity Manager – are the sources of the data for making a determination. Trends in the data
and results will be reviewed and examined to identify any patterns of behavior that may
inform future program designs.
8.3 Determination of EECBG Attribution
Once the range and scope of incremental effects have been defined, we will employ a combination of techniques, culminating in a Delphi process, to review the information and to confirm and quantify the magnitude of the net effects of the set of EECBG projects within each BPA.
Evaluators of energy efficiency programs have used various types of analytical approaches to
assess and quantify net program effects on adoption of energy efficiency measures and
practices. In this case, we will employ a combination of techniques of historical tracing/case
study development and structured expert judging.
Historical Tracing/Case Study Development. This approach relies on an assessment of self-reports of program effects
by targeted market actors and typically involves surveying samples of actual and/or
potential program participants to elicit their assessment of the program’s influence on
their decisions to adopt energy efficiency measures or practices (in this case, Grant
Activity Managers). The questions can be structured to probe the effect of the program
on the timing, extent, and features of the projects in question, as well as the relative
importance of the program versus other decision factors. The responses will then be
processed to develop an attribution score using a transparent algorithm.
Structured expert judging. Structured expert judgment studies assemble panels of
individuals with close working knowledge (in this case DOE Program Officers, Regional
and State Coordinators) of the technology, infrastructure systems, markets, and political
environments addressed by a given energy efficiency program to estimate impacts with
and without the program in place. Structured expert judgment processes employ a
variety of specific techniques to ensure that the participating experts specify and take
into account key assumptions about the specific mechanisms by which the programs
achieve their effects. The Delphi process is the most widely known of this family of
methods.
We will develop attribution scoring algorithms from the collected and secondary data to estimate
net savings impacts. The final determination of attribution – what portion of the estimated savings can reasonably be attributed to the existence of and funding received from the EECBG Program – will be made using a Delphi process with the engineers closest to the Activities being studied, plus a panel of Subject Matter Experts composed of senior team members. The
adjustments will be reviewed within each BPA grouping, and an average attribution level – or
realization rate – determined by BPA and for the EECBG program overall.
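As an illustration of how such a scoring algorithm could work, the sketch below applies hypothetical weights to the three attribution factors introduced above; the actual weights and functional form will be settled through data collection and the Delphi process:

```python
def attribution_score(funding_share, timing_influence, extent_influence,
                      weights=(0.4, 0.3, 0.3)):
    """Illustrative attribution score on a 0-1 scale.

    funding_share    -- EECBG share of total Activity funding (0-1)
    timing_influence -- self-reported influence on project timing (0-1)
    extent_influence -- self-reported influence on project scope (0-1)
    weights          -- hypothetical factor weights; the production
                        algorithm will be finalized via the Delphi process
    """
    w_fund, w_time, w_extent = weights
    return (w_fund * funding_share + w_time * timing_influence
            + w_extent * extent_influence)

# Example: EECBG provided 60% of funds, strongly accelerated timing (0.8),
# and moderately expanded project scope (0.5).
score = attribution_score(0.60, 0.80, 0.50)  # -> 0.63
```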
9. Conduct Carbon Emissions Reduction Analysis
9.1 Assessment of Carbon Impacts
The assessment of gross carbon dioxide (CO2) savings will be done for each broad program category and for the individual indicator activities. Annualized CO2 reductions achieved as a result of EECBG-funded efforts will be calculated and reported for each year over the effective useful lifetime (EUL) of the measures evaluated. When the consumption of energy from fossil fuel resources is reduced, the CO2 emissions that would have resulted from burning those fuels are avoided. Likewise, when renewable energy is used as an alternative to fossil fuels, the CO2 emissions associated with the replaced fuels are avoided.
In this study, the carbon emissions avoided from EECBG-funded energy efficiency and renewable energy activities will be reported nationally and for each state.
The approach to be taken is consistent with recommendations contained in the Model Energy
Efficiency Program Impact Evaluation Guide13 (“the Guide”). As noted in the Guide: “The
methods for determining avoided emissions values for displaced generation range from fairly
straightforward to highly complex. They include both spreadsheet-based calculations and
dynamic modeling approaches with varying degrees of transparency, rigor, and cost. Evaluators
can decide which method best meets their needs, given evaluation objectives and available
resources and data quality requirements.”
For this study, the basic approach selected employs the use of emission factors as follows:
avoided emissions_t = (net energy savings)_t × (emission factor)_t
The emission factor is expressed as mass per unit of energy (e.g., pounds of CO 2 per MWh),
and represents the characteristics of the emission sources displaced by reduced generation
from conventional sources of electricity and non-electrical loads including natural gas, fuel oil
and propane.
13
National Action Plan for Energy Efficiency (2007). Model Energy Efficiency Program Impact Evaluation
Guide. Prepared by Steven R. Schiller, Schiller Consulting, Inc.
Non-base load emissions rates from the US Environmental Protection Agency’s (EPA)
Emissions & Generation Resource Integrated Database (eGRID)14 will be used to quantify
avoided emissions. Non-base load emission rates have been developed to estimate the
emissions from marginal generation units, which are those most likely to be displaced by energy
efficiency and/or renewable energy programs and projects. The non-base load emission metric
is recommended by EPA for this purpose,15 and is appropriate to the level of analysis called for
in the EECBG evaluation.
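A minimal sketch of this emission-factor calculation (the non-baseload rate shown is hypothetical, not an actual eGRID value):

```python
LBS_PER_SHORT_TON = 2000.0

def avoided_co2_tons(net_mwh_savings, nonbaseload_lbs_per_mwh):
    """Avoided CO2 (short tons) = net energy savings x emission factor."""
    return net_mwh_savings * nonbaseload_lbs_per_mwh / LBS_PER_SHORT_TON

# Example: 1,500 MWh of verified net savings in a region with an assumed
# non-baseload emission rate of 1,520 lbs CO2/MWh.
tons = avoided_co2_tons(1_500, 1_520)  # -> 1,140 short tons
```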
14 eGRID2010, the most recent version, will be used for this analysis.
15 E.H. Pechan & Associates, Inc., “The Emissions & Generation Resource Integrated Database for 2010 (eGRID2010) Technical Support Document,” prepared for the U.S. Environmental Protection Agency, Office of Atmospheric Programs, Clean Air Markets Division, Washington, DC, December 2010.
10. Conduct Employment Analysis
10.1 Broad Parameters of Jobs Assessment
The measurement of net annual job impacts will occur at the state level for each BPA. Those
BPAs containing several heterogeneous program activities will require job impact estimation for
each of those activities.
10.2 Economic Impact Model for Identifying Job Impacts
Our proposed approach includes a 51-region (state) REMI Policy Insight simulation model.
Information describing the short-term and long-term project-related effects will be introduced
into this economic model to identify the annual projection of job impacts. This analysis system
has been applied to numerous energy and environmental policy/program analyses. A brief
overview of the REMI model capabilities follows below.
REMI is chosen over other models because it has the relevant economic levers and feedbacks
to handle the types of effects expected to flow from such project spending and energy saving
(generating) technology adoption. The model is a computable general equilibrium (CGE) simulation forecasting system of industry-level activity for 23 different industries (approximating three-digit NAICS definitions of business activity) through the year 2050. It is well-specified
through its internal logic or equation set, such that feedbacks among economic stakeholders
(households, businesses and public agency budgets) are captured when more energy-efficiency
and renewable generation investments take place. The feedback mechanisms capture both the
increases and decreases in spending, demand and employment that result from an increase in
spending occurring in a single or multiple industries both within and across geographic regions.
Figure 10-1 portrays the basic concept of what the REMI model captures for a region’s
economic impacts (a region can be a county/state or any combination of county building blocks).
There are five major blocks to a region’s economy (e.g. Output, Labor & Capital Supply, etc.);
each block contains numerous equations, and the arrows depict the feedback between different
components of an economy. In a multi-state model (of 51 regions), one can envision 51
economies, such as in Figure 10-1, which will also exhibit feedback between other states (interregional) for labor flows (commuters) and trade in manufactured goods and in services. Unique
to the REMI model, among the class of competing regional economic impact frameworks
available, is the linkage to the market shares block. Policies or investments that change the
underlying cost-of-doing-business for an industry in region k will affect that industry’s relative
competitiveness (relative to the U.S. average for that industry) and its ability to retain/gain sales
within its own region, elsewhere in the multi-region marketplace, elsewhere in the U.S. and for
non-U.S. trade.
Figure 10-1: REMI Economic Forecasting Model – Basic Structure and Linkages
[Figure: five linked blocks – Output; Population & Labor Supply; Labor & Capital Demand; Market Shares; and Wages, Prices, & Profits – with investment tied to new technologies and energy savings entering as external inputs.]
The REMI model identifies estimates of job impacts (and numerous other economic and
demographic metrics) by comparing the base case16 annual forecast using the above
structure/feedbacks to the annual forecast when energy-related savings/costs or new dollars of
investment are proposed through the alternative forecast. Total economic impacts result from
the direct economic effects of EECBG project investment. The total impact equals the direct plus non-direct impacts. Non-direct impacts are sometimes referred to as the ripple effect in an economy. It is the presence of a comprehensive, region-specific set of multiplier effects in the REMI economic simulation model that creates additional economic responses once the direct effects have been introduced. Two economic mechanisms follow as a result of the direct program effects: changes in consumer demand (often labeled ‘induced’ effects) and changes in intermediate demand or Business to Business “B2B” (often labeled ‘indirect’ effects). The REMI model reports a total impact concept reflecting the increases and decreases that occur across industry sectors and regions, and though it does not report separate induced and indirect contributions, both are accounted for, and we can segment these post-analysis.

16 The regionally-calibrated software model is delivered with a standard Regional Control forecast out to 2050. This analysis has assumed that forecast is a sufficient long-term representation of the base case economies.
The total economic impacts (stated in terms of net jobs for this study objective) are expressed
as a difference relative to jobs in year t without the program. Figure 10-2 portrays this
relationship.
Oak Ridge National Laboratory
73
February 9, 2012
Figure 10-2: Identifying Economic Impacts in the REMI Framework
[Figure: a policy action, together with baseline values for all policy variables, is run through the REMI model to produce a control forecast and an alternative forecast; comparing the two forecasts identifies the effects of the proposed action.]
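To make the forecast comparison concrete, a minimal sketch (the job counts are illustrative only; the actual control and alternative forecasts come from the REMI Policy Insight model):

```python
# Net job impacts in year t = alternative forecast (with EECBG effects)
# minus control forecast (without them). Hypothetical values.
control_forecast = {2012: 10_000, 2013: 10_150, 2014: 10_300}
alternative_forecast = {2012: 10_240, 2013: 10_330, 2014: 10_390}

net_job_impacts = {year: alternative_forecast[year] - control_forecast[year]
                   for year in control_forecast}
# -> {2012: 240, 2013: 180, 2014: 90}: impacts taper as spending winds down.
```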
10.3 Translating EECBG Project Direct Effects into Economic Events
The REMI model will translate the ways in which EECBG dollars affect various segments
through relevant direct effects that exert an influence on the local economy. Relevant direct
effects include those to specific energy customer segments (e.g. change in price, consumption
or both), a region’s economic self-sufficiency (by replacing imported purchases of energy
generating feed stocks/ energy driven components with more locally provided energy
conserving devices/services), and the incremental cost to energy customers and/or government
to achieve these goals. These direct effects, expressed as data inputs, will be developed as part
of the data collection activities described in Section 3. In Figure 10-3 below, the left portion of
the diagram portrays the set of direct effects that are possible with a broad range of energy-related investments/objectives. The major categories of direct effects associated with energy
policies/investments and their potential to initiate macroeconomic responses are described
below:
Program operations (administrative) spending — dollars spent to operate the state's EECBG program and to incentivize businesses and households to invest
Household and business savings — dollar savings to businesses and households
(resulting from reductions in energy consumed and (potentially) electric demand),
realized as a result of the EECBG funded project
Household and business cost — additional household and business expenditures
associated with the incremental cost of purchasing energy-efficient equipment/customer-sited RE systems (generally the total cost of new equipment minus incentives paid by
the program and net of what would otherwise have been spent anyway). They may also
include a ratepayer effect (a benefit in the case of lower rates/avoided costs or a
negative if higher rates result.)
Other spending shifts — shifts in patterns of spending and business sales among
sectors of the state’s economy affecting the flow of dollars into, out of, and within the
state. Included here are “import substitution” effects, new O&M spending requirements
for new technology facilities/systems, as well as potential contraction for the power
generating sector in light of energy-efficiency project uptake.
The “mapping” or translation of the above categories of direct effects into the economic impact
model is depicted in the upper right portion of Figure 10-3. This entails careful delineation of
instances when a new pattern of local demands arising from some or all energy customer
segments represents opportunities for greater reliance on “within region” sales, or none at all.
The latter signals a continued import requirement albeit for an energy-efficient device instead of
imported coal or petroleum feed stocks. Installation and other contractor services are more likely
to be locally provided. Net savings to participating households and businesses (after paying off
equipment investment cost differentials) have a clear pathway into the economic impact model
and subsequent net job impacts. While employment may increase in the industries directly affected by the influx of demand and funding from the EECBG program, other industries, such as fossil fuel sectors,
may see reductions in demand for their products, which may result in a reduction in employment in those sectors.
Figure 10-3: REEM [Renewable Energy Efficiency Mapping] Framework for Energy Impact Analysis
(©2005-2011 Economic Development Research Group, Inc.)
10.4 Presentation of Job Impacts
The key outputs of the macroeconomic modeling exercise will be presented to show the state-level job impact process at the BPA or program Activity level. From the model's outputs, we will
be able to do the following:
Distinguish the time-phase of impacts, e.g. short-term activities, long-term persistent
changes
Distinguish the direct net jobs from the indirect and induced net job impacts
Use the results from attribution analyses by BPA above to estimate the attributable net
job impacts associated with total project investment/implementation
Perform aggregations to produce BPA-, state-, and national-level net job impacts from EECBG projects for each program year to be evaluated
11. Conduct Analysis of Organizational/Operational Factors Influencing Outcomes
A unique feature of this evaluation will be the performance of a statistical analysis of factors that
can affect program performance. The objective of the statistical analysis will be to identify key
factors that are significantly related to Activity outcomes. An understanding of the factors related
to successful performance can be helpful to public policy makers, program managers, and other
parties interested in the adoption and effective utilization of energy efficiency and renewable
energy technologies.
11.1 Incorporation of Other Data
The first step in identifying key factors influencing performance will be an examination of past
studies exploring the relationships between various organizational and operational factors and
outcomes achieved by energy efficiency and renewable energy activities. A source of
information will be published proceedings from energy efficiency, renewable energy, and
evaluation conferences. Two key proceedings that will be examined are those associated with
the annual American Council for an Energy-Efficient Economy (ACEEE) Summer Study and the biennial International Energy Program Evaluation Conference (IEPEC). Other proceedings
will also be examined, as relevant.
Several major online evaluation databases will also be searched. The best known of these are
the California Measurement Advisory Council (CALMAC) Searchable Database; the Consortium
for Energy Efficiency (CEE) Market Assessment and Program Evaluation (MAPE)
Clearinghouse; and the New York State Energy Research and Development Authority
(NYSERDA) library of New York Energy $mart quarterly and annual evaluation reports.
Other reports and journal articles documenting state, utility, and university studies could provide
useful information and those will be examined as well.
11.2 Regression Modeling
The objective of the task is to identify the characteristics of the program and implementation of
the grant or sub-grant that influenced the performance of a specific grant or sub-grant. The
analysis for this task will be based on a statistical regression model. A regression framework
will allow identification of key organizational and operational characteristics that explain the
relative level of savings per grant dollar with statistical rigor and quantify the relationship
between those characteristics and grant performance.
The factors for the regression analysis will be based on the findings from the above-described
literature review and the experience of ORNL advisors and senior KEMA evaluation staff
regarding performance-affecting factors from previous evaluations. An initial meeting with the
project team was held to develop a preliminary list of variables for consideration. It is important
to conceptualize this early in the project so that data collection instruments will capture the
necessary information to feed the model. The precise nature of key variables will be determined
for each sampled Activity through the review of Activity records and direct interviews with the
involved parties (project manager, grant manager). The data for this analysis will also come
from outside sources, such as the US Census.
Key variables could include design, implementation, operational, technical assistance, market,
and psychosocial factors. Table 11-1 below lists some examples of factors to be examined in
this study within six potential categories of influence.
Table 11-1: Example Factors Influencing EECBG Outcomes

Design Factors
Example factors: Use of best practice features; identification of an appropriate baseline; consideration of market potential; approach to market segment being served; etc.
Example definitions: Number of best practice features: numeric; Articulation of baseline: yes or no (1 or 0)

Implementation Factors
Example factors: Presence of a champion/strong leadership; staffing resources; prior experience; use of vendors; marketing types; marketing schedule; incentive amounts and types; measure types; educational features; customer service features (e.g., presence of 800 number or other tech support); continuity of staff; type of staff (temporary, part-time, full-time); etc.
Example definitions: Presence of champion: yes or no (1 or 0); Number of staff: numeric; Marketing schedule: 3=frequent, 2=fairly frequent, 1=minimal, 0=none

Operational Factors
Example factors: Data processing approach; frequency of reporting; presence of follow-up or customer satisfaction feature; structure of back office (one location, multiple locations); target budget vs. actual budget; etc.
Example definitions: Data processing approach: 2=web-based data entry by implementers, 1=manual emailing of spreadsheets or other format for data entry by program managers, 0=no use of forms or tables, simply text, email or verbal status reports

Technical Assistance Factors
Example factors: Training features for internal implementation staff; training of trade allies; certification requirements; source of training; etc.
Example definitions: Training features: 2=strong training program, good frequency and locations (accessible), 1=good training, not as frequent and/or accessible; 0=no training

Market Factors
Example factors: Target population; number of segments targeted; unemployment rate; educational levels of target market; number of competing programs; accessibility of measures; etc.
Example definitions: Unemployment rate: use state values or municipal if known/available

Psychosocial Factors
Example factors: Alignment of messaging with audience; type of message; customization to different segments within the market (non-English speaking groups, low income); etc.
Example definitions: Alignment of message: 1=good alignment, 0=poor alignment
Each Activity will be weighted to reflect its relative share of overall EECBG funding. Statistical
analyses can be performed for each BPA separately as well as for the combined set of
program areas under study.
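As a minimal sketch of the weighted regression framework (the data, variable names, and coefficients below are entirely hypothetical and serve only to illustrate the model structure):

```python
import numpy as np
import statsmodels.api as sm

# Illustrative weighted regression of savings per grant dollar on example
# factors from Table 11-1; all data here are synthetic.
rng = np.random.default_rng(0)
n = 120
X = np.column_stack([
    rng.integers(0, 2, n),   # presence of a champion (1/0)
    rng.integers(0, 4, n),   # marketing schedule (0-3 scale)
    rng.uniform(3, 12, n),   # local unemployment rate (%)
])
y = 0.5 + 0.3 * X[:, 0] + 0.1 * X[:, 1] - 0.02 * X[:, 2] + rng.normal(0, 0.2, n)

# Weights reflect each Activity's relative share of overall EECBG funding.
weights = rng.uniform(0.5, 2.0, n)

model = sm.WLS(y, sm.add_constant(X), weights=weights).fit()
print(model.summary())  # coefficients flag factors related to performance
```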
11.3 Other Statistical Analysis Approaches
The types of statistical analyses used will include at minimum a regression analysis, as
discussed above. Other applicable statistical analyses, such as a correlation analysis, will be
determined through the literature review, discussed with ORNL/DOE, and used as appropriate.
12. Reporting and Presentation
12.1 Interim Reports
During the EECBG project, we will provide ORNL and its advisors with memos summarizing our
interim findings. Specifically, we will provide summary memos upon completion of the following
key tasks:
Sample Design and Selection
Energy Savings Analysis
KEMA will meet with the ORNL team either in-person or by telephone to discuss the results and
solicit comments. Following that interaction, KEMA will revise the interim memos as needed.
12.2 Draft and Final Reports and Presentation
Upon completion of the study, we will provide ORNL with a draft report for comment. KEMA will
incorporate ORNL’s comments and prepare a revised draft report for distribution for peer review
by an independent panel of evaluation experts and key stakeholders including DOE. We will
present the results during an in-person meeting with DOE, ORNL and the Peer Review Panel.
Comments and direction received during and following the presentation will be incorporated into
the final written report.
13. Project Management and Administration
The national evaluation of EECBG is a complex study that necessitates a structured and
disciplined project management approach. The management fundamentals employed are
based on the Project Management Body of Knowledge (PMBOK), as published by the Project
Management Institute (PMI). Due to the magnitude of this project, the evaluation team includes
four key resources that provide overall management and administrative support: 1) Principal in Charge (PIC), 2) Project Manager, 3) Deputy Project Manager, and 4) Project Coordinator.
The Project Manager reports directly to the PIC. This system allows for tight project control and
the benefit of experienced oversight via the PIC who can help predict and address project risks
and issues while guiding the project to success. The Deputy Project Manager, a PMI certified
Project Management Professional (PMP), coordinates and manages the daily operations and
activities with the support of the Project Coordinator.
13.1 Description of Project Management Tools
KEMA adopts industry best practices into its standard project planning and management
approach, refining processes that address budget management, client communication, risk
management, schedule adherence, scope supervision, personnel guidance, and quality
assurance/control, among others. The suite of tools with which KEMA manages projects includes both commercial and proprietary systems:
MS Project - Project schedule, resource management, subcontractor management, team
communication
SharePoint - Communication, collaboration, file share, resource management
KEMA Project Tracker 1.0 - Budget management, resource management, forecasting
KEMA RAM – Internal resource management, Resource Allocation Management (RAM)
Oracle - Resource management
Risk Memo - Risk identification and mitigation strategies
Weekly Status Updates – Client / team communication
13.2 Schedule
KEMA is managing and tracking the evaluation schedule and resources in MS Project.
Appendix B provides a Gantt chart showing the major tasks along with their durations and
associated interdependencies. Upon approval of the final Work Plan, the major tasks shown in
the schedule will be broken down into multiple subtasks to allow for more accurate and tighter
project control.
13.3 Budget Allocation and Expected Spend Rate
KEMA’s Project Manager will track project time and expenses on a weekly basis, using reports
from individual consultants and standard work-in-progress (WIP) reports from KEMA’s Oracle
reporting systems. The results of the project budget expenditures will be tracked and reported in
our progress reports (including % spent vs. % project time elapsed), including any mitigating
actions required to conform to estimates. In addition, each monthly status report will provide a
three-month projection of the expected spend rate for the project. By employing Microsoft
Project in conjunction with our internal project reporting systems, KEMA will utilize available
Earned Value Management (EVM) techniques and internal budget tracking tools to measure the
project’s status.
13.4 Project Team and Responsibilities
The KEMA Team is composed of highly experienced, nationally-recognized individuals who
have management responsibility for the project, supported by a set of subject matter experts
assigned to Broad Program Areas and analytical components consistent with their expertise.
The Senior KEMA Management Team will consist of the following individuals:
Miriam Goldberg, PhD, Officer in Charge and Senior Technical Advisor. As the
official liaison to KEMA’s top management, Dr. Goldberg will ensure that DOE/ORNL are
completely satisfied with the quality of our work and dedication of our staff and
subcontractors. Dr. Goldberg will lead the sample design and statistical analysis for the
project.
Luisa M. Freeman, MSc., Principal-in-Charge, will oversee the management and
execution of the project and will be responsible for quality assurance. Ms. Freeman will
lead the survey design and implementation.
Shawn Intorcio, Project Manager, will be responsible for the day-to-day management
and oversight of the technical work under this contract.
Justin Holtzman, Deputy Project Manager, PMP, will assist Ms. Intorcio in the
execution of the project and will be responsible for maintaining the data necessary for
ensuring the project is on track and within budget.
This project will coordinate closely with work proceeding under the National SEP Evaluation.
KEMA leaders for that project, Kathleen Gaffney and Tim Pettit, will communicate regularly
about the two projects so that any cost efficiencies can be identified, and that technical solutions
might be shared as appropriate.
13.4.1 Project Organization Chart
Figure 13-1: Project Organization Chart
[Organization chart not reproduced here.]
Appendix A – Risk Management Memo

To: Colleen Rizy, Martin Schweitzer, Rick Schmoyer, Joel Eisenberg - ORNL
From: Luisa Freeman and the KEMA Evaluation Team
Subject: EECBG Evaluation Budget/Quality Risk Mitigation Strategies - Final
Copy to: Nick Hall, TecMarket Works
Date: November 30, 2011
This memorandum summarizes the EECBG Team’s risk mitigation strategies that were
discussed on October 19, 2011 at the KEMA Fairfax offices. The strategies below reflect
comments received from ORNL and its advisors regarding:
Quickly identifying risks
Promptly facilitating the appropriate mitigation strategy
Ensuring that all identified risks are managed in such a manner that no reduction in
rigor level or quality of methods or results occurs.
Further, by identifying and addressing these potential risks, the team will be able to better
manage the project budget. Comments received on this memo will be addressed and a final
strategy produced to serve as an action plan if any of the identified risks materialize.
RISK 1 (Administrative Risk): Delays in completion of the ICR process may result in a delay to the project such that it cannot be completed within the current schedule.
Mitigation Strategies:
1. KEMA will allocate the resources necessary to complete the ICR package in a timely manner.
2. KEMA will request an Emergency ICR in addition to the standard ICR.
3. KEMA will submit the survey instrument(s) to OMB at the same time as issuing the 30-Day Public Notice in anticipation of limited to no comments.
4. KEMA will identify tasks that can proceed while awaiting ICR approval.
5. KEMA will reserve remaining project resources until after ICR approval is obtained.
RISK 2. (Administrative Risk) DOE may elect to postpone the completion date for some grants, thus potentially limiting the number of grants available for sampling within the necessary timeframe.

Mitigation Strategies:
1. KEMA will draw the sample from activities across both grants and sub-grants. Those activities that are completed by the time of ICR approval will be kept in the sample.
2. For incomplete activities, we will determine whether enough information is available from the desk review to estimate energy savings.

RISK 3. (Administrative Risk) Subcontractor availability may be compromised due to the need to stop work until the ICR process is approved.

Mitigation Strategies:
1. KEMA may elect to proceed with desk reviews during the ICR review process, maintaining the original schedule.
2. KEMA will shift work to other subcontractors and/or KEMA staff to accomplish the tasks on time and within budget.
3. KEMA has duplicate skill sets to allow for this contingency.

RISK 4. (Data Availability/Adequacy Risk) A lack of data on EECBG State sub-grants may result in higher costs for creating a sampling frame.

Mitigation Strategies:
1. KEMA will pursue sub-grant data from the DOE Golden office to determine its potential usefulness for categorization of sub-grants.
2. KEMA will leverage the grant close-out tagging process to identify sub-areas and activities at a level necessary for evaluation purposes.
3. KEMA will explore the extent to which DOE can incorporate State sub-grant data into PAGE, thus eliminating the need to devote project time and resources to the creation of a separate State sub-grant database.
4. The sampling procedure used will identify the state sub-grants for which additional detail will be needed, eliminating the need to gather detailed information for sub-grants made by all the states.

RISK 5. (Data Adequacy Risk) Anticipated project-level data are not available to support energy savings calculation ("evaluability").

Mitigation Strategies (a sketch of the substitution logic follows this list):
1. The sample will include substitutes, in addition to the primary sample, to achieve the targeted number of completed evaluations by program area.
2. KEMA will obtain data that will allow the evaluability of each primary sample Activity to be determined using identified protocols, before moving to a substitute.
3. To limit the need for such a substitution, extra effort will be devoted to assuring evaluability of the certainty selections.
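To make the substitution logic concrete, the following is a minimal illustrative sketch in Python. The function and its field names (activity_id, program_area, is_evaluable) are hypothetical assumptions for illustration only; they are not part of the evaluation tools described in this plan.

def fill_sample(primary, substitutes, targets):
    """Keep every evaluable primary selection, then draw substitutes in
    listed order until each program area reaches its target count."""
    kept = {area: [] for area in targets}
    for act in primary:
        area = act["program_area"]
        if act["is_evaluable"] and area in kept:
            kept[area].append(act["activity_id"])
    for act in substitutes:
        area = act["program_area"]
        if act["is_evaluable"] and area in kept and len(kept[area]) < targets[area]:
            kept[area].append(act["activity_id"])
    return kept

# Invented usage: one primary point fails the evaluability screen,
# so one substitute is drawn to reach the target of two completions.
targets = {"Buildings": 2}
primary = [
    {"activity_id": "A1", "program_area": "Buildings", "is_evaluable": False},
    {"activity_id": "A2", "program_area": "Buildings", "is_evaluable": True},
]
substitutes = [
    {"activity_id": "S1", "program_area": "Buildings", "is_evaluable": True},
]
print(fill_sample(primary, substitutes, targets))  # {'Buildings': ['A2', 'S1']}

Under this scheme, a substitute is consulted only after the evaluability of the primary points in its program area has been determined, consistent with mitigation 2.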
RISK 6. (Data Availability Risk) Attribution estimates could be compromised if data and planning assumptions are not available.

Mitigation Strategies:
1. KEMA will devote adequate project resources toward identifying the best person for each project to discuss attribution issues.
2. KEMA will make use of other resources, such as studies being conducted where there is overlap, so EECBG evaluation funding might be leveraged. Examples include an ongoing study at LBL and State EECBG evaluations being conducted independently.
3. The tagging process will be used to capture other funding sources leveraged at a time when such knowledge should be current and up-to-date.
4. KEMA will work with ORNL/DOE to establish a level-of-effort ceiling per Activity, after which the KEMA team will move to a simpler alternative means of assigning attribution for that Activity.
5. In consultation with ORNL/DOE, KEMA will specify the alternative "last resort" means of assigning attribution. This method will be used only for cases where the planned attribution analysis is not possible.
6. KEMA will apply alternatives such as establishing a range and using the midpoint, borrowing findings from similar PAs, or some combination of these (a worked example of the midpoint approach follows this list).
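As a purely illustrative example of the "range and midpoint" fallback named in mitigation 6 (the numbers below are invented, not drawn from any EECBG activity):

# Hypothetical example: evidence bounds an activity's attribution
# between 40% and 70%, so the midpoint (55%) is applied.
low, high = 0.40, 0.70
attribution = (low + high) / 2      # 0.55
gross_kwh = 120_000                 # invented gross savings for the activity
net_kwh = attribution * gross_kwh   # 66,000 kWh attributed to EECBG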
RISK 7. (Methodology Risk) Multiple calls may be needed to identify the appropriate individuals to interview for each sampled Activity.

Mitigation Strategies:
1. KEMA will have DOE Project Officers review the contact list that is developed to avoid "false starts."
2. KEMA will use the tagging process at project closure to identify the best contacts for ongoing projects.
3. KEMA will devote adequate resources to ensure that the best person from each project is identified to respond to evaluation questions.

RISK 8. (Methodology Risk) If engineering estimates of energy savings are not done efficiently and accurately in a reasonable amount of time, costs will escalate and the quality of the study will be at risk.

Mitigation Strategies (an illustrative per-measure calculation follows this list):
1. The desk review process will identify the specific level of detail required for each Activity in order to estimate energy savings.
2. A calculation tool can be developed and employed once sample points are known (quantities and other variables to be inserted later).
3. KEMA will leverage the Savings Calculation Tool developed under SEP for appropriate activities.
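To illustrate the kind of per-measure formula such a calculation tool might encode, here is a minimal sketch using a standard lighting-retrofit estimate. The function, its inputs, and the sample values are illustrative assumptions; this is not the SEP Savings Calculation Tool itself.

def lighting_kwh_savings(fixtures, baseline_watts, retrofit_watts, hours_per_year):
    """First-year kWh savings for a lighting retrofit: fixture quantity
    times wattage reduction times annual operating hours, converted to kWh."""
    return fixtures * (baseline_watts - retrofit_watts) * hours_per_year / 1000.0

# Invented sample point: 200 fixtures, 112 W baseline, 54 W retrofit, 3,500 h/yr
print(lighting_kwh_savings(200, 112, 54, 3500))  # 40600.0 kWh per year

Quantities and operating hours would be inserted per sample point as the desk reviews establish them, which is the pattern mitigation 2 describes.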
RISK 9. (Methodology Risk) The range of savings and non-savings related activities among some grants may compromise the ability to extrapolate findings from individual activities to Broad Program Areas.

Mitigation Strategies:
1. KEMA will make every reasonable attempt within the budget to develop a rich database for the sampled projects from which to estimate impacts and success factors.
2. An approach for extrapolating savings from activities to Broad Program Areas will take into account the possibility that some grants may have high percentages of non-savings dollars (one such approach is sketched after this list).
3. KEMA will make use of people experienced in sample expansion to determine a rigorous approach to extrapolation.
4. KEMA will outline the process for aggregating Activity results to sub-areas, Broad Program Areas, and cumulative levels in the Work Plan.
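One simple expansion approach consistent with mitigation 2 is a ratio estimator over savings-eligible dollars, which excludes non-savings spending from both the numerator and the denominator. The sketch below is illustrative only, with hypothetical field names; the actual estimator will be specified in the Work Plan, as mitigation 4 states.

def expand_to_program_area(sampled_activities, frame_savings_dollars):
    """Ratio expansion: verified kWh per savings-eligible dollar in the
    sample, scaled to the Broad Program Area's total savings-eligible
    spending. Non-savings dollars are excluded throughout; assumes the
    sampled activities have nonzero savings-eligible spending."""
    kwh = sum(a["verified_kwh"] for a in sampled_activities)
    dollars = sum(a["savings_eligible_dollars"] for a in sampled_activities)
    return frame_savings_dollars * kwh / dollars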
Appendix B – Project Schedule

[Gantt chart: the key tasks listed below are scheduled across October 2011 through December 2012, with milestone dates marked at 3-Nov, 10-Nov, and 22-Dec.]

Key Tasks:
Development of Workplan
Sample Design
Wave 1 - Sample Selection
Data Base Development
Development of Survey Instruments
ICR Review Process
Assess the Feasibility to Perform Non-ICR Analysis
Conduct Non-ICR Analysis (if necessary)
Wave 1 - Desk Reviews
Refresh Data Base and Add TAG Results
Wave 2 - Sample Selection
Wave 2 - Desk Reviews
Savings Analysis
Job Creation Analysis
Emissions Analysis
Analysis of Performance Factors
Activity Aggregation
Draft Report
Final Presentation
Final Report